8975 1727204028.27635: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-MVC
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
8975 1727204028.28085: Added group all to inventory
8975 1727204028.28087: Added group ungrouped to inventory
8975 1727204028.28092: Group all now contains ungrouped
8975 1727204028.28095: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml
8975 1727204028.48213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
8975 1727204028.48262: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
8975 1727204028.48283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
8975 1727204028.48329: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
8975 1727204028.48409: Loaded config def from plugin (inventory/script)
8975 1727204028.48412: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
8975 1727204028.48445: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
8975 1727204028.48540: Loaded config def from plugin (inventory/yaml)
8975 1727204028.48542: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
8975 1727204028.48640: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
8975 1727204028.49126: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
8975 1727204028.49129: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
8975 1727204028.49133: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
8975 1727204028.49139: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
8975 1727204028.49144: Loading data from /tmp/network-jrl/inventory-0Xx.yml
8975 1727204028.49211: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto
8975 1727204028.49262: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
8975 1727204028.49299: Loading data from /tmp/network-jrl/inventory-0Xx.yml
8975 1727204028.49371: group all already in inventory
8975 1727204028.49381: set inventory_file for managed-node1
8975 1727204028.49386: set inventory_dir for managed-node1
8975 1727204028.49387: Added host managed-node1 to inventory
8975 1727204028.49390: Added host managed-node1 to group all
8975 1727204028.49391: set ansible_host for managed-node1
8975 1727204028.49392: set ansible_ssh_extra_args for managed-node1
8975 1727204028.49397: set inventory_file for managed-node2
8975 1727204028.49399: set inventory_dir for managed-node2
8975 1727204028.49400: Added host managed-node2 to inventory
8975 1727204028.49401: Added host managed-node2 to group all
8975 1727204028.49402: set ansible_host for managed-node2
8975 1727204028.49402: set ansible_ssh_extra_args for managed-node2
8975 1727204028.49404: set inventory_file for managed-node3
8975 1727204028.49406: set inventory_dir for managed-node3
8975 1727204028.49406: Added host managed-node3 to inventory
8975 1727204028.49407: Added host managed-node3 to group all
8975 1727204028.49408: set ansible_host for managed-node3
8975 1727204028.49408: set ansible_ssh_extra_args for managed-node3
8975 1727204028.49410: Reconcile groups and hosts in inventory.
8975 1727204028.49414: Group ungrouped now contains managed-node1
8975 1727204028.49416: Group ungrouped now contains managed-node2
8975 1727204028.49417: Group ungrouped now contains managed-node3
8975 1727204028.49495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
8975 1727204028.49630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
8975 1727204028.49685: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
8975 1727204028.49714: Loaded config def from plugin (vars/host_group_vars)
8975 1727204028.49717: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
8975 1727204028.49741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
8975 1727204028.49751: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8975 1727204028.49800: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
8975 1727204028.50158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8975 1727204028.50261: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
8975 1727204028.50309: Loaded config def from plugin (connection/local)
8975 1727204028.50312: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
8975 1727204028.50904: Loaded config def from plugin (connection/paramiko_ssh)
8975 1727204028.50908: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
8975 1727204028.51586: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8975 1727204028.51617: Loaded config def from plugin (connection/psrp)
8975 1727204028.51619: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
8975 1727204028.52104: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8975 1727204028.52134: Loaded config def from plugin (connection/ssh)
8975 1727204028.52136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
8975
1727204028.54897: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8975 1727204028.54941: Loaded config def from plugin (connection/winrm) 8975 1727204028.54944: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 8975 1727204028.54980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 8975 1727204028.55047: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 8975 1727204028.55092: Loaded config def from plugin (shell/cmd) 8975 1727204028.55094: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 8975 1727204028.55113: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 8975 1727204028.55160: Loaded config def from plugin (shell/powershell) 8975 1727204028.55162: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 8975 1727204028.55204: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 8975 1727204028.55323: Loaded config def from plugin (shell/sh) 8975 1727204028.55325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 8975 1727204028.55352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 8975 1727204028.55435: Loaded config def from plugin (become/runas) 8975 1727204028.55437: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 8975 1727204028.55559: Loaded config def from plugin (become/su) 8975 1727204028.55561: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 8975 1727204028.55670: Loaded config def from plugin (become/sudo) 8975 1727204028.55672: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 8975 1727204028.55699: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 8975 1727204028.55940: in VariableManager get_vars() 8975 1727204028.55956: done with get_vars() 8975 1727204028.56058: trying /usr/local/lib/python3.12/site-packages/ansible/modules 8975 1727204028.58186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 8975 1727204028.58274: in VariableManager get_vars() 8975 1727204028.58278: done with get_vars() 8975 1727204028.58280: variable 'playbook_dir' from source: magic vars 8975 1727204028.58281: variable 'ansible_playbook_python' from source: magic vars 8975 1727204028.58281: variable 'ansible_config_file' from source: magic vars 8975 1727204028.58282: variable 'groups' from source: magic vars 8975 1727204028.58282: variable 'omit' from source: magic 
vars 8975 1727204028.58283: variable 'ansible_version' from source: magic vars 8975 1727204028.58283: variable 'ansible_check_mode' from source: magic vars 8975 1727204028.58284: variable 'ansible_diff_mode' from source: magic vars 8975 1727204028.58284: variable 'ansible_forks' from source: magic vars 8975 1727204028.58285: variable 'ansible_inventory_sources' from source: magic vars 8975 1727204028.58285: variable 'ansible_skip_tags' from source: magic vars 8975 1727204028.58285: variable 'ansible_limit' from source: magic vars 8975 1727204028.58286: variable 'ansible_run_tags' from source: magic vars 8975 1727204028.58286: variable 'ansible_verbosity' from source: magic vars 8975 1727204028.58316: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml 8975 1727204028.58797: in VariableManager get_vars() 8975 1727204028.58811: done with get_vars() 8975 1727204028.58818: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 8975 1727204028.59488: in VariableManager get_vars() 8975 1727204028.59501: done with get_vars() 8975 1727204028.59508: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 8975 1727204028.59590: in VariableManager get_vars() 8975 1727204028.59614: done with get_vars() 8975 1727204028.59710: in VariableManager get_vars() 8975 1727204028.59724: done with get_vars() 8975 1727204028.59731: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 8975 1727204028.59783: in VariableManager get_vars() 8975 1727204028.59793: done with get_vars() 8975 1727204028.59999: in VariableManager get_vars() 8975 1727204028.60009: done with get_vars() 8975 1727204028.60013: variable 'omit' from source: magic vars 8975 1727204028.60028: variable 'omit' from source: magic vars 8975 1727204028.60052: in VariableManager get_vars() 8975 1727204028.60061: done with get_vars() 8975 1727204028.60096: in VariableManager get_vars() 8975 1727204028.60105: done with get_vars() 8975 1727204028.60133: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 8975 1727204028.60285: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 8975 1727204028.60374: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 8975 1727204028.60806: in VariableManager get_vars() 8975 1727204028.60823: done with get_vars() 8975 1727204028.61129: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 8975 1727204028.61231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 8975 1727204028.62347: in VariableManager get_vars() 8975 1727204028.62362: 
done with get_vars() 8975 1727204028.62371: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 8975 1727204028.62502: in VariableManager get_vars() 8975 1727204028.62516: done with get_vars() 8975 1727204028.62602: in VariableManager get_vars() 8975 1727204028.62614: done with get_vars() 8975 1727204028.62873: in VariableManager get_vars() 8975 1727204028.62887: done with get_vars() 8975 1727204028.62890: variable 'omit' from source: magic vars 8975 1727204028.62907: variable 'omit' from source: magic vars 8975 1727204028.62932: in VariableManager get_vars() 8975 1727204028.62942: done with get_vars() 8975 1727204028.62956: in VariableManager get_vars() 8975 1727204028.62967: done with get_vars() 8975 1727204028.62991: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 8975 1727204028.63060: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 8975 1727204028.64657: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 8975 1727204028.64916: in VariableManager get_vars() 8975 1727204028.64935: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 8975 1727204028.66352: in VariableManager get_vars() 8975 1727204028.66371: done with get_vars() 8975 1727204028.66379: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 8975 1727204028.66724: in VariableManager get_vars() 8975 1727204028.66739: done with get_vars() 8975 1727204028.66783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 8975 1727204028.66794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 8975 1727204028.66972: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 8975 1727204028.67082: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 8975 1727204028.67084: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 8975 1727204028.67107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 8975 1727204028.67125: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8975 1727204028.67236: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 8975 
1727204028.67279: Loaded config def from plugin (callback/default) 8975 1727204028.67281: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8975 1727204028.68108: Loaded config def from plugin (callback/junit) 8975 1727204028.68110: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8975 1727204028.68149: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 8975 1727204028.68194: Loaded config def from plugin (callback/minimal) 8975 1727204028.68195: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8975 1727204028.68226: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8975 1727204028.68269: Loaded config def from plugin (callback/tree) 8975 1727204028.68271: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 8975 1727204028.68357: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 8975 1727204028.68359: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
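For context on the startup trace above (version banner, collection location /tmp/collections-MVC, the YAML inventory at /tmp/network-jrl/inventory-0Xx.yml, and the pid/timestamp prefix on every line), the sketch below shows one way such a run can be reproduced. The original CI job's command line and environment are not recorded in the log, so the verbosity flag, the ANSIBLE_DEBUG setting, and the use of subprocess are assumptions rather than the recorded invocation; only the paths are taken from the log.

```python
# Hypothetical re-run of the traced job; paths come from the log above,
# everything else is an assumption.
import os
import subprocess

env = dict(
    os.environ,
    ANSIBLE_COLLECTIONS_PATH="/tmp/collections-MVC",  # "ansible collection location" in the banner
    ANSIBLE_DEBUG="1",  # assumed: produces the "<pid> <timestamp>: ..." debug lines seen here
)

subprocess.run(
    [
        "ansible-playbook",
        "-vvvvv",  # assumed verbosity; the trace includes low-level command and sftp detail
        "-i", "/tmp/network-jrl/inventory-0Xx.yml",
        "/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles"
        "/tests/network/tests_bond_deprecated_nm.yml",
    ],
    env=env,
    check=True,
)
```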
PLAYBOOK: tests_bond_deprecated_nm.yml *****************************************
2 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml
8975 1727204028.68381: in VariableManager get_vars()
8975 1727204028.68393: done with get_vars()
8975 1727204028.68397: in VariableManager get_vars()
8975 1727204028.68403: done with get_vars()
8975 1727204028.68406: variable 'omit' from source: magic vars
8975 1727204028.68433: in VariableManager get_vars()
8975 1727204028.68444: done with get_vars()
8975 1727204028.68459: variable 'omit' from source: magic vars
PLAY [Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider] ***
8975 1727204028.68882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
8975 1727204028.68937: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
8975 1727204028.68968: getting the remaining hosts for this loop
8975 1727204028.68969: done getting the remaining hosts for this loop
8975 1727204028.68972: getting the next task for host managed-node2
8975 1727204028.68975: done getting next task for host managed-node2
8975 1727204028.68976: ^ task is: TASK: Gathering Facts
8975 1727204028.68977: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8975 1727204028.68979: getting variables
8975 1727204028.68980: in VariableManager get_vars()
8975 1727204028.68990: Calling all_inventory to load vars for managed-node2
8975 1727204028.68992: Calling groups_inventory to load vars for managed-node2
8975 1727204028.68994: Calling all_plugins_inventory to load vars for managed-node2
8975 1727204028.69003: Calling all_plugins_play to load vars for managed-node2
8975 1727204028.69012: Calling groups_plugins_inventory to load vars for managed-node2
8975 1727204028.69014: Calling groups_plugins_play to load vars for managed-node2
8975 1727204028.69039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8975 1727204028.69080: done with get_vars()
8975 1727204028.69085: done getting variables
8975 1727204028.69135: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
Tuesday 24 September 2024 14:53:48 -0400 (0:00:00.008) 0:00:00.008 *****
8975 1727204028.69153: entering _queue_task() for managed-node2/gather_facts
8975 1727204028.69154: Creating lock for gather_facts
8975 1727204028.69449: worker is 1 (out of 1 available)
8975 1727204028.69462: exiting _queue_task() for managed-node2/gather_facts
8975 1727204028.69478: done queuing things up, now waiting for results queue to drain
8975 1727204028.69480: waiting for pending results...
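The "Calling all_inventory ... groups_plugins_play" lines above are VariableManager.get_vars() walking its variable sources for managed-node2 in precedence order, with later sources overriding earlier ones. The toy below only illustrates that ordering; the variable name and values are hypothetical, and ansible-core's real merge logic (combine_vars and hash_behaviour handling) is more involved than a plain dict update.

```python
# Illustration only: later variable sources win, as in the get_vars() sequence above.
# The key "network_provider" and its values are hypothetical.
stages = [
    ("all_inventory",            {"network_provider": "initscripts"}),
    ("groups_inventory",         {}),
    ("all_plugins_inventory",    {}),
    ("all_plugins_play",         {}),
    ("groups_plugins_inventory", {}),
    ("groups_plugins_play",      {"network_provider": "nm"}),
]

merged = {}
for _name, payload in stages:
    merged.update(payload)  # a later stage overrides an earlier one for the same key

print(merged)  # {'network_provider': 'nm'}
```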
8975 1727204028.69629: running TaskExecutor() for managed-node2/TASK: Gathering Facts
8975 1727204028.69694: in run() - task 127b8e07-fff9-9356-306d-0000000000cd
8975 1727204028.69706: variable 'ansible_search_path' from source: unknown
8975 1727204028.69744: calling self._execute()
8975 1727204028.69798: variable 'ansible_host' from source: host vars for 'managed-node2'
8975 1727204028.69803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
8975 1727204028.69812: variable 'omit' from source: magic vars
8975 1727204028.69892: variable 'omit' from source: magic vars
8975 1727204028.69913: variable 'omit' from source: magic vars
8975 1727204028.69946: variable 'omit' from source: magic vars
8975 1727204028.69984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8975 1727204028.70057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8975 1727204028.70077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8975 1727204028.70092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8975 1727204028.70102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8975 1727204028.70130: variable 'inventory_hostname' from source: host vars for 'managed-node2'
8975 1727204028.70133: variable 'ansible_host' from source: host vars for 'managed-node2'
8975 1727204028.70136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
8975 1727204028.70212: Set connection var ansible_module_compression to ZIP_DEFLATED
8975 1727204028.70216: Set connection var ansible_connection to ssh
8975 1727204028.70219: Set connection var ansible_shell_executable to /bin/sh
8975 1727204028.70227: Set connection var ansible_timeout to 10
8975 1727204028.70230: Set connection var ansible_shell_type to sh
8975 1727204028.70239: Set connection var ansible_pipelining to False
8975 1727204028.70261: variable 'ansible_shell_executable' from source: unknown
8975 1727204028.70264: variable 'ansible_connection' from source: unknown
8975 1727204028.70268: variable 'ansible_module_compression' from source: unknown
8975 1727204028.70270: variable 'ansible_shell_type' from source: unknown
8975 1727204028.70273: variable 'ansible_shell_executable' from source: unknown
8975 1727204028.70275: variable 'ansible_host' from source: host vars for 'managed-node2'
8975 1727204028.70278: variable 'ansible_pipelining' from source: unknown
8975 1727204028.70280: variable 'ansible_timeout' from source: unknown
8975 1727204028.70285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
8975 1727204028.70432: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
8975 1727204028.70441: variable 'omit' from source: magic vars
8975 1727204028.70445: starting attempt loop
8975 1727204028.70447: running the handler
8975 1727204028.70462: variable 'ansible_facts' from source: unknown
8975 1727204028.70482: _low_level_execute_command(): starting
8975 1727204028.70486: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ &&
sleep 0' 8975 1727204028.71051: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204028.71056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204028.71058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204028.71106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204028.71110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204028.71112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204028.71197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204028.72977: stdout chunk (state=3): >>>/root <<< 8975 1727204028.73088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204028.73159: stderr chunk (state=3): >>><<< 8975 1727204028.73162: stdout chunk (state=3): >>><<< 8975 1727204028.73182: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204028.73196: _low_level_execute_command(): starting 8975 1727204028.73203: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379 `" && echo ansible-tmp-1727204028.7318208-9045-244655295830379="` echo /root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379 `" ) && sleep 0' 8975 1727204028.73694: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204028.73698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204028.73701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204028.73710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204028.73762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204028.73772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204028.73775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204028.73842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204028.75890: stdout chunk (state=3): >>>ansible-tmp-1727204028.7318208-9045-244655295830379=/root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379 <<< 8975 1727204028.76005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204028.76062: stderr chunk (state=3): >>><<< 8975 1727204028.76067: stdout chunk (state=3): >>><<< 8975 1727204028.76083: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204028.7318208-9045-244655295830379=/root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204028.76116: variable 'ansible_module_compression' from source: unknown 8975 1727204028.76166: ANSIBALLZ: Using generic lock for ansible.legacy.setup 8975 1727204028.76170: ANSIBALLZ: Acquiring lock 8975 1727204028.76173: ANSIBALLZ: Lock acquired: 140501807209920 8975 1727204028.76175: ANSIBALLZ: Creating module 8975 
1727204029.15305: ANSIBALLZ: Writing module into payload 8975 1727204029.15437: ANSIBALLZ: Writing module 8975 1727204029.15616: ANSIBALLZ: Renaming module 8975 1727204029.15620: ANSIBALLZ: Done creating module 8975 1727204029.15622: variable 'ansible_facts' from source: unknown 8975 1727204029.15625: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204029.15627: _low_level_execute_command(): starting 8975 1727204029.15630: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 8975 1727204029.16206: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204029.16225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204029.16240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204029.16260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204029.16280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204029.16293: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204029.16307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.16385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.16419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204029.16437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204029.16460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204029.16570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204029.18374: stdout chunk (state=3): >>>PLATFORM <<< 8975 1727204029.18444: stdout chunk (state=3): >>>Linux <<< 8975 1727204029.18481: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 8975 1727204029.18623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204029.18691: stderr chunk (state=3): >>><<< 8975 1727204029.18694: stdout chunk (state=3): >>><<< 8975 1727204029.18711: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204029.18723 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 8975 1727204029.18767: _low_level_execute_command(): starting 8975 1727204029.18771: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 8975 1727204029.18859: Sending initial data 8975 1727204029.18862: Sent initial data (1181 bytes) 8975 1727204029.19258: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204029.19264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.19272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204029.19283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204029.19286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.19327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204029.19330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204029.19335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204029.19408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204029.23060: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 
(Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 8975 1727204029.23459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204029.23525: stderr chunk (state=3): >>><<< 8975 1727204029.23529: stdout chunk (state=3): >>><<< 8975 1727204029.23539: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204029.23615: variable 'ansible_facts' from source: unknown 8975 1727204029.23619: variable 'ansible_facts' from source: unknown 8975 1727204029.23627: variable 'ansible_module_compression' from source: unknown 8975 1727204029.23659: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8975 1727204029.23689: variable 'ansible_facts' from source: unknown 8975 1727204029.23811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/AnsiballZ_setup.py 8975 1727204029.23936: Sending initial data 8975 1727204029.23939: Sent initial data (152 bytes) 8975 1727204029.24438: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204029.24441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.24444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204029.24446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.24501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204029.24505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204029.24587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204029.26186: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204029.26252: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204029.26323: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp0of5d06r /root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/AnsiballZ_setup.py <<< 8975 1727204029.26330: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/AnsiballZ_setup.py" <<< 8975 1727204029.26396: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 8975 1727204029.26399: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp0of5d06r" to remote "/root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/AnsiballZ_setup.py" <<< 8975 1727204029.27645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204029.27720: stderr chunk (state=3): >>><<< 8975 1727204029.27724: stdout chunk (state=3): >>><<< 8975 1727204029.27748: done transferring module to remote 8975 1727204029.27761: _low_level_execute_command(): starting 8975 1727204029.27767: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/ /root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/AnsiballZ_setup.py && sleep 0' 8975 1727204029.28235: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204029.28239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.28269: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204029.28273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.28330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204029.28333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204029.28336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204029.28411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204029.30250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204029.30312: stderr chunk (state=3): >>><<< 8975 1727204029.30316: stdout chunk (state=3): >>><<< 8975 1727204029.30332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204029.30335: _low_level_execute_command(): starting 8975 1727204029.30340: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/AnsiballZ_setup.py && sleep 0' 8975 1727204029.30841: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204029.30845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.30848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204029.30850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204029.30907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204029.30911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204029.30918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204029.30995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204029.33289: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 8975 1727204029.33299: stdout chunk (state=3): >>>import _imp # builtin <<< 8975 1727204029.33337: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 8975 1727204029.33407: stdout chunk (state=3): >>>import '_io' # <<< 8975 1727204029.33411: stdout chunk (state=3): >>>import 'marshal' # <<< 8975 1727204029.33448: stdout chunk (state=3): >>>import 'posix' # <<< 8975 1727204029.33483: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 8975 1727204029.33512: stdout chunk (state=3): >>>import 'time' # <<< 8975 1727204029.33515: stdout chunk (state=3): >>>import 'zipimport' # <<< 8975 1727204029.33518: stdout chunk (state=3): >>># installed 
zipimport hook <<< 8975 1727204029.33576: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.33594: stdout chunk (state=3): >>>import '_codecs' # <<< 8975 1727204029.33614: stdout chunk (state=3): >>>import 'codecs' # <<< 8975 1727204029.33661: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 8975 1727204029.33681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 8975 1727204029.33689: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6fa4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6f73b30> <<< 8975 1727204029.33721: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 8975 1727204029.33741: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6fa6ab0> <<< 8975 1727204029.33749: stdout chunk (state=3): >>>import '_signal' # <<< 8975 1727204029.33788: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 8975 1727204029.33794: stdout chunk (state=3): >>> <<< 8975 1727204029.33806: stdout chunk (state=3): >>>import 'io' # <<< 8975 1727204029.33841: stdout chunk (state=3): >>>import '_stat' # <<< 8975 1727204029.33844: stdout chunk (state=3): >>>import 'stat' # <<< 8975 1727204029.33936: stdout chunk (state=3): >>>import '_collections_abc' # <<< 8975 1727204029.33961: stdout chunk (state=3): >>>import 'genericpath' # <<< 8975 1727204029.33968: stdout chunk (state=3): >>>import 'posixpath' # <<< 8975 1727204029.33996: stdout chunk (state=3): >>>import 'os' # <<< 8975 1727204029.34017: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 8975 1727204029.34025: stdout chunk (state=3): >>>Processing user site-packages <<< 8975 1727204029.34028: stdout chunk (state=3): >>>Processing global site-packages <<< 8975 1727204029.34048: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 8975 1727204029.34058: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 8975 1727204029.34068: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 8975 1727204029.34097: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 8975 1727204029.34099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 8975 1727204029.34117: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6d791c0> <<< 8975 1727204029.34183: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 8975 1727204029.34192: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.34205: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6d7a0c0> <<< 8975 1727204029.34228: stdout chunk (state=3): >>>import 'site' # <<< 8975 1727204029.34267: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 8975 1727204029.34667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 8975 1727204029.34684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 8975 1727204029.34697: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 8975 1727204029.34712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.34731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 8975 1727204029.34780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 8975 1727204029.34795: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 8975 1727204029.34823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 8975 1727204029.34836: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6db7f20> <<< 8975 1727204029.34864: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 8975 1727204029.34877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 8975 1727204029.34901: stdout chunk (state=3): >>>import '_operator' # <<< 8975 1727204029.34904: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6dcc0b0> <<< 8975 1727204029.34928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 8975 1727204029.34958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 8975 1727204029.34988: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 8975 1727204029.35034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.35061: stdout chunk (state=3): >>>import 'itertools' # <<< 8975 1727204029.35088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6def8c0> <<< 8975 1727204029.35114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 8975 1727204029.35124: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 8975 1727204029.35136: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6deff50> <<< 8975 1727204029.35144: stdout chunk (state=3): >>>import '_collections' # <<< 8975 1727204029.35207: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6dcfbc0> <<< 8975 1727204029.35211: stdout chunk (state=3): >>>import '_functools' # <<< 8975 1727204029.35247: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6dcd310> <<< 8975 1727204029.35344: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6db50d0> <<< 8975 1727204029.35376: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 8975 1727204029.35394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 8975 1727204029.35409: stdout chunk (state=3): >>>import '_sre' # <<< 8975 1727204029.35428: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 8975 1727204029.35458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 8975 1727204029.35482: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 8975 1727204029.35489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 8975 1727204029.35519: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e13890> <<< 8975 1727204029.35537: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e124b0> <<< 8975 1727204029.35576: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 8975 1727204029.35579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6dce1e0> <<< 8975 1727204029.35582: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e10c50> <<< 8975 1727204029.35638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 8975 1727204029.35653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e448c0> <<< 8975 1727204029.35659: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6db4350> <<< 8975 1727204029.35681: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 8975 1727204029.35684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 8975 1727204029.35706: stdout chunk (state=3): >>># extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.35728: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.35734: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e44d70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e44c20> <<< 8975 1727204029.35773: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.35776: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e45010> <<< 8975 1727204029.35786: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6db2e70> <<< 8975 1727204029.35815: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 8975 1727204029.35819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.35847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 8975 1727204029.35874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 8975 1727204029.35896: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e456d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e453a0> <<< 8975 1727204029.35909: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 8975 1727204029.35938: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 8975 1727204029.35947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 8975 1727204029.35956: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e465d0> <<< 8975 1727204029.35983: stdout chunk (state=3): >>>import 'importlib.util' # <<< 8975 1727204029.35986: stdout chunk (state=3): >>>import 'runpy' # <<< 8975 1727204029.36015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 8975 1727204029.36046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 8975 1727204029.36079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 8975 1727204029.36087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e5c800> <<< 8975 1727204029.36096: stdout chunk (state=3): >>>import 'errno' # <<< 8975 1727204029.36129: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.36145: stdout chunk 
(state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.36155: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e5dee0> <<< 8975 1727204029.36162: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 8975 1727204029.36184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 8975 1727204029.36201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 8975 1727204029.36215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 8975 1727204029.36220: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e5ed80> <<< 8975 1727204029.36268: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.36277: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e5f3e0> <<< 8975 1727204029.36286: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e5e2d0> <<< 8975 1727204029.36304: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 8975 1727204029.36312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 8975 1727204029.36355: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.36362: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e5fdd0> <<< 8975 1727204029.36380: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e5f500> <<< 8975 1727204029.36420: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e46630> <<< 8975 1727204029.36446: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 8975 1727204029.36469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 8975 1727204029.36491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 8975 1727204029.36512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 8975 1727204029.36548: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.36553: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6b9fce0> <<< 
8975 1727204029.36576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 8975 1727204029.36611: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6bc8770> <<< 8975 1727204029.36616: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bc84d0> <<< 8975 1727204029.36640: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.36644: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6bc87a0> <<< 8975 1727204029.36673: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.36678: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6bc8980> <<< 8975 1727204029.36694: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6b9de80> <<< 8975 1727204029.36718: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 8975 1727204029.36823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 8975 1727204029.36854: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 8975 1727204029.36893: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bc9fd0> <<< 8975 1727204029.36897: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bc8c50> <<< 8975 1727204029.36926: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e46d20> <<< 8975 1727204029.36942: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 8975 1727204029.36993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.37010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 8975 1727204029.37059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 8975 1727204029.37087: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bf6330> <<< 8975 1727204029.37143: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 8975 1727204029.37153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.37179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 8975 1727204029.37200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 8975 1727204029.37257: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c0e4b0> <<< 8975 1727204029.37279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8975 1727204029.37322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 8975 1727204029.37382: stdout chunk (state=3): >>>import 'ntpath' # <<< 8975 1727204029.37408: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c4b230> <<< 8975 1727204029.37429: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 8975 1727204029.37469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 8975 1727204029.37500: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 8975 1727204029.37537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 8975 1727204029.37636: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c6d9d0> <<< 8975 1727204029.37712: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c4b350> <<< 8975 1727204029.37756: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c0f140> <<< 8975 1727204029.37784: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6a5c350> <<< 8975 1727204029.37807: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c0d520> <<< 8975 1727204029.37813: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bcaf00> <<< 8975 1727204029.37981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 8975 1727204029.38005: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6ea6c0d8b0> <<< 8975 1727204029.38183: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_ansible.legacy.setup_payload_278ll49g/ansible_ansible.legacy.setup_payload.zip' <<< 8975 1727204029.38190: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.38338: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.38370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 8975 1727204029.38373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8975 1727204029.38424: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8975 1727204029.38500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8975 1727204029.38532: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6ac20f0> <<< 8975 1727204029.38544: stdout chunk (state=3): >>>import '_typing' # <<< 8975 1727204029.38744: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6a98fe0> <<< 8975 1727204029.38748: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6a981a0> <<< 8975 1727204029.38760: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.38784: stdout chunk (state=3): >>>import 'ansible' # <<< 8975 1727204029.38795: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.38819: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.38824: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.38846: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 8975 1727204029.38867: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.40424: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.41730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6a9bf80> <<< 8975 1727204029.41734: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.41771: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 8975 1727204029.41774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 8975 1727204029.41803: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 8975 1727204029.41830: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.41836: stdout chunk 
(state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6af5a30> <<< 8975 1727204029.41883: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6af57c0> <<< 8975 1727204029.41915: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6af50d0> <<< 8975 1727204029.41936: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 8975 1727204029.41943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 8975 1727204029.41983: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6af5520> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6ac2d80> <<< 8975 1727204029.41998: stdout chunk (state=3): >>>import 'atexit' # <<< 8975 1727204029.42024: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6af6780> <<< 8975 1727204029.42057: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.42062: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6af69c0> <<< 8975 1727204029.42086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 8975 1727204029.42128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 8975 1727204029.42141: stdout chunk (state=3): >>>import '_locale' # <<< 8975 1727204029.42237: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6af6f00> import 'pwd' # <<< 8975 1727204029.42250: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 8975 1727204029.42346: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6958cb0> <<< 8975 1727204029.42372: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea695a8d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 8975 1727204029.42427: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695b1d0> <<< 8975 1727204029.42438: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 8975 1727204029.42537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695c3b0> <<< 8975 1727204029.42650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 8975 1727204029.42949: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695eea0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea695f1d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695d160> <<< 8975 1727204029.42953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6962e10> import '_tokenize' # <<< 8975 1727204029.42955: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69618e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6961640> <<< 8975 1727204029.42957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 8975 1727204029.42973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 8975 1727204029.43030: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6963c50> <<< 8975 1727204029.43060: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695d670> <<< 8975 1727204029.43093: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69a6fc0> <<< 8975 1727204029.43122: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69a70e0> <<< 8975 1727204029.43229: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 8975 1727204029.43246: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69acce0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69acaa0> <<< 8975 1727204029.43308: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 8975 1727204029.43436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69af1d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69ad370> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 8975 1727204029.43509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.43716: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 8975 1727204029.43744: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69b69f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69af380> <<< 8975 1727204029.43779: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69b7890> <<< 8975 1727204029.43822: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69b7a40> <<< 8975 1727204029.43953: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69b7b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69a73e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 8975 1727204029.44001: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.44021: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69bb320> <<< 8975 1727204029.44219: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69bc710> <<< 8975 1727204029.44290: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69b9ac0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69bae40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69b9700> # zipimport: zlib available <<< 8975 1727204029.44294: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 8975 1727204029.44383: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.44588: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 8975 1727204029.44622: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 8975 1727204029.44693: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.44809: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.45789: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.46178: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea68447a0> <<< 8975 1727204029.46248: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 8975 1727204029.46287: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6845520> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69bc8f0> <<< 8975 1727204029.46350: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.46360: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 8975 1727204029.46545: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.46833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 8975 1727204029.46836: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68455b0> # zipimport: zlib available <<< 8975 1727204029.47240: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.47750: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.47826: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.48018: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 8975 1727204029.48080: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.48173: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 8975 1727204029.48207: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.48472: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 8975 1727204029.48577: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.48833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 8975 1727204029.49033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6847d10> # zipimport: zlib available <<< 8975 1727204029.49060: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.49147: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' 
# <<< 8975 1727204029.49200: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 8975 1727204029.49290: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.49581: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea684e1b0> <<< 8975 1727204029.49585: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea684ea80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68471a0> # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.49599: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 8975 1727204029.49750: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.49767: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.49814: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 8975 1727204029.50013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea684d7c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea684ec00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 8975 1727204029.50035: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.50145: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.50288: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.50407: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 8975 1727204029.50562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68e6cc0> <<< 8975 1727204029.50595: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68589e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6856ae0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6856930> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 8975 1727204029.50615: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.50635: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.50683: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 8975 1727204029.50781: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.50939: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 8975 1727204029.51001: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.51023: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51043: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.51089: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51119: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 8975 1727204029.51140: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51281: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51306: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51322: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51376: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 8975 1727204029.51673: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51695: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51763: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51805: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.51867: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.51891: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 8975 1727204029.51978: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68e9ac0> <<< 8975 1727204029.52013: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 8975 1727204029.52042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 8975 1727204029.52223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6328290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6328590> <<< 8975 1727204029.52255: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68c92e0> <<< 8975 1727204029.52268: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68c8410> <<< 8975 1727204029.52291: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68e81a0> <<< 8975 1727204029.52372: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68ebd70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 8975 1727204029.52396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 8975 1727204029.52423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 8975 1727204029.52751: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea632b560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea632ae10> <<< 8975 1727204029.52758: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea632afc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea632a240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea632b6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6396210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6394230> <<< 8975 1727204029.52781: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68ebc50> <<< 8975 1727204029.52813: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 8975 1727204029.52838: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 8975 1727204029.52860: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.52924: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.53000: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 8975 1727204029.53054: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.53124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 8975 1727204029.53178: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 8975 1727204029.53264: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.53287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.53353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 8975 1727204029.53368: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.53718: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.53779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 8975 1727204029.54337: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.54919: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.54932: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.54954: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.54984: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 8975 1727204029.55016: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 8975 
1727204029.55072: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 8975 1727204029.55213: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.55373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 8975 1727204029.55878: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6396330> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 8975 1727204029.55882: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6396f30> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 8975 1727204029.55973: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 8975 1727204029.56338: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.56393: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 8975 1727204029.56428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 8975 1727204029.56557: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204029.56668: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea63c24e0> <<< 8975 1727204029.57008: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea63aec90> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 8975 1727204029.57012: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.57298: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.57484: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.57500: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 8975 1727204029.57537: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.57874: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 8975 1727204029.57878: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea5cfa060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea63af500> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 8975 1727204029.57881: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 8975 1727204029.57974: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.58141: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 8975 1727204029.58152: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.58261: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.58537: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.58541: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.58974: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 8975 1727204029.58986: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.59276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 8975 1727204029.59280: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.59282: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.59911: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.60563: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.60644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 8975 1727204029.60662: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.60758: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.60873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 8975 1727204029.61246: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.61301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 8975 1727204029.61304: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.61458: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 8975 
1727204029.61773: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.61777: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.61879: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.62035: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 8975 1727204029.62106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 8975 1727204029.62110: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.62220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 8975 1727204029.62324: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.62384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 8975 1727204029.62628: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.62692: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 8975 1727204029.63029: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.63492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 8975 1727204029.63506: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 8975 1727204029.63541: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.63548: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.63711: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 8975 1727204029.63843: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.63890: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 8975 1727204029.63895: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.63899: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 8975 1727204029.63901: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.64134: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.64281: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.64355: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 8975 1727204029.64359: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 8975 1727204029.64464: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 
1727204029.64683: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 8975 1727204029.64687: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.64901: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 8975 1727204029.64905: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.65273: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.65603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.65608: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204029.65610: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 8975 1727204029.65641: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204029.66297: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea5d22840> <<< 8975 1727204029.66304: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d23590> <<< 8975 1727204029.66383: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d1fe60> <<< 8975 1727204029.80105: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d685f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d69a60> <<< 8975 1727204029.80151: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 8975 1727204029.80177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204029.80386: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea63c82c0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d6b980> <<< 8975 1727204029.80531: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 8975 1727204030.00813: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", 
"ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "49", "epoch": "1727204029", "epoch_int": "1727204029", "date": "2024-09-24", "time": "14:53:49", "iso8601_micro": "2024-09-24T18:53:49.670816Z", "iso8601": "2024-09-24T18:53:49Z", "iso8601_basic": "20240924T145349670816", "iso8601_basic_short": "20240924T145349", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root"<<< 8975 1727204030.00827: stdout chunk (state=3): >>>, "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_loadavg": {"1m": 0.482421875, "5m": 0.390625, "15m": 0.18310546875}, "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3076, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 640, "free": 3076}, "nocache": {"free": 3499, "used": 217}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 376, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251327447040, "block_size": 4096, "block_total": 64479564, "block_available": 61359240, "block_used": 3120324, "inode_total": 16384000, "inode_available": 16301531, "inode_used": 82469, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} <<< 8975 1727204030.01470: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 8975 1727204030.01528: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path 
# cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token <<< 8975 1727204030.01574: stdout chunk (state=3): >>># destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # 
cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 8975 1727204030.01634: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd <<< 8975 1727204030.01679: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy 
ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 8975 1727204030.02014: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 8975 1727204030.02039: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 8975 1727204030.02089: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 8975 1727204030.02124: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport <<< 8975 1727204030.02162: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 8975 1727204030.02202: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 8975 1727204030.02220: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 8975 1727204030.02273: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 8975 1727204030.02313: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 8975 1727204030.02348: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 8975 1727204030.02398: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 8975 1727204030.02445: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 8975 1727204030.02472: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 8975 1727204030.02506: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 8975 1727204030.02520: stdout chunk (state=3): >>># destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 8975 1727204030.02588: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 8975 1727204030.02611: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] 
wiping _tokenize # cleanup[3] wiping platform <<< 8975 1727204030.02664: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 8975 1727204030.02694: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 8975 1727204030.02722: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 8975 1727204030.02743: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8975 1727204030.02893: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 8975 1727204030.02938: stdout chunk (state=3): >>># destroy _collections <<< 8975 1727204030.02942: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 8975 1727204030.03002: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 8975 1727204030.03005: stdout chunk (state=3): >>># destroy _typing <<< 8975 1727204030.03041: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 8975 1727204030.03056: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 8975 1727204030.03152: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 8975 1727204030.03328: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # 
destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 8975 1727204030.03332: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 8975 1727204030.03902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204030.03928: stderr chunk (state=3): >>><<< 8975 1727204030.03936: stdout chunk (state=3): >>><<< 8975 1727204030.04241: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6fa4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6f73b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6fa6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6d791c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6d7a0c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
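Every "import X #" and "# code object from ..." line in this stdout is CPython's verbose import trace: the remote environment captured in the facts above includes PYTHONVERBOSE set to "1", which turns that trace on for the interpreter executing the module payload. A minimal standalone sketch (hypothetical, not part of Ansible itself) that reproduces the same style of trace:

import os
import subprocess
import sys

# With PYTHONVERBOSE=1 in the environment, CPython prints one line per module
# import ("import X # ...", "# code object from ...") to stderr; this is the
# same trace format interleaved through the module output above.
env = dict(os.environ, PYTHONVERBOSE="1")
proc = subprocess.run(
    [sys.executable, "-c", "import base64"],
    env=env,
    capture_output=True,
    text=True,
)
print(proc.stderr)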
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6db7f20> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6dcc0b0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6def8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6deff50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6dcfbc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6dcd310> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6db50d0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e13890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e124b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6dce1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e10c50> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e448c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6db4350> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e44d70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e44c20> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e45010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6db2e70> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e456d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e453a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e465d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e5c800> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e5dee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6ea6e5ed80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e5f3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e5e2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6e5fdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e5f500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e46630> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6b9fce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6bc8770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bc84d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6bc87a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6bc8980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6b9de80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bc9fd0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bc8c50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6e46d20> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bf6330> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c0e4b0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c4b230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c6d9d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c4b350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c0f140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6a5c350> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6c0d520> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6bcaf00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6ea6c0d8b0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_278ll49g/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6ac20f0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6a98fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6a981a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6a9bf80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6af5a30> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6af57c0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6af50d0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6af5520> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6ac2d80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6af6780> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6af69c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6af6f00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6958cb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea695a8d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695b1d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695c3b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695eea0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea695f1d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695d160> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6962e10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69618e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6961640> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6963c50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea695d670> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69a6fc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69a70e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69acce0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69acaa0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69af1d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69ad370> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69b69f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69af380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69b7890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69b7a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69b7b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69a73e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69bb320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69bc710> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69b9ac0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea69bae40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69b9700> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea68447a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6845520> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea69bc8f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68455b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6847d10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea684e1b0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea684ea80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68471a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea684d7c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea684ec00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68e6cc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68589e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6856ae0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6856930> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68e9ac0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6328290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6328590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68c92e0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68c8410> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68e81a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68ebd70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea632b560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea632ae10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea632afc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea632a240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea632b6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea6396210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6394230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea68ebc50> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6396330> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea6396f30> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea63c24e0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea63aec90> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea5cfa060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea63af500> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6ea5d22840> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d23590> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d1fe60> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d685f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d69a60> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea63c82c0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6ea5d6b980> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": 
"ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "49", "epoch": "1727204029", "epoch_int": "1727204029", "date": "2024-09-24", "time": "14:53:49", "iso8601_micro": "2024-09-24T18:53:49.670816Z", "iso8601": "2024-09-24T18:53:49Z", "iso8601_basic": "20240924T145349670816", "iso8601_basic_short": "20240924T145349", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_loadavg": {"1m": 0.482421875, "5m": 0.390625, "15m": 0.18310546875}, "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3076, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 640, "free": 3076}, "nocache": {"free": 3499, "used": 217}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 376, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251327447040, "block_size": 4096, "block_total": 64479564, "block_available": 61359240, "block_used": 3120324, "inode_total": 16384000, "inode_available": 16301531, "inode_used": 82469, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # 
cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # 
destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing 
ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy 
__main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # 
cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
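The debug1/debug2 lines just above ("auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'", "mux_client_request_session: master session id: 2") show OpenSSH connection multiplexing at work: ansible-core keeps a persistent master SSH connection per host under ~/.ansible/cp/ and runs each module invocation over it, so every low-level command avoids a fresh handshake. A minimal sketch of requesting the same behaviour explicitly, assuming a YAML inventory like the one this run loads (the ControlPersist value and ControlPath pattern below are illustrative assumptions, not the exact defaults applied here):

    all:
      hosts:
        managed-node2:
          ansible_host: 10.31.47.73
          # extra -o options are passed straight to the ssh client on every connection
          ansible_ssh_common_args: >-
            -o ControlMaster=auto
            -o ControlPersist=60s
            -o ControlPath=~/.ansible/cp/%C

The same options can also be set once for all hosts via ssh_args in the [ssh_connection] section of ansible.cfg.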
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # 
destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy 
_stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
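The closing warning is ansible-core's interpreter discovery notice: it located /usr/bin/python3.12 on managed-node2 automatically and cautions that the meaning of that discovered path could change if another Python interpreter is installed later. One way to make the choice explicit and silence the warning, sketched here against the same kind of YAML inventory this run uses (the actual test inventory is not shown in the log), is to pin ansible_python_interpreter for the host:

    all:
      hosts:
        managed-node2:
          # pin the interpreter that discovery selected, so a later Python
          # install on the node cannot change which one Ansible uses
          ansible_python_interpreter: /usr/bin/python3.12

The equivalent global setting is interpreter_python in the [defaults] section of ansible.cfg.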
8975 1727204030.07885: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204030.07919: _low_level_execute_command(): starting 8975 1727204030.07925: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204028.7318208-9045-244655295830379/ > /dev/null 2>&1 && sleep 0' 8975 1727204030.08674: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204030.08679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204030.08682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204030.08772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204030.08776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204030.08821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204030.08874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204030.08937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204030.11048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204030.11297: stderr chunk (state=3): >>><<< 8975 1727204030.11301: stdout chunk (state=3): >>><<< 8975 1727204030.11304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204030.11306: handler run complete 8975 1727204030.11573: variable 'ansible_facts' from source: unknown 8975 1727204030.11731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.12516: variable 'ansible_facts' from source: unknown 8975 1727204030.12730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.12968: attempt loop complete, returning result 8975 1727204030.12982: _execute() done 8975 1727204030.12990: dumping result to json 8975 1727204030.13022: done dumping result, returning 8975 1727204030.13038: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-9356-306d-0000000000cd] 8975 1727204030.13047: sending task result for task 127b8e07-fff9-9356-306d-0000000000cd ok: [managed-node2] 8975 1727204030.14318: no more pending results, returning what we have 8975 1727204030.14324: results queue empty 8975 1727204030.14325: checking for any_errors_fatal 8975 1727204030.14327: done checking for any_errors_fatal 8975 1727204030.14328: checking for max_fail_percentage 8975 1727204030.14329: done checking for max_fail_percentage 8975 1727204030.14330: checking to see if all hosts have failed and the running result is not ok 8975 1727204030.14331: done checking to see if all hosts have failed 8975 1727204030.14332: getting the remaining hosts for this loop 8975 1727204030.14507: done getting the remaining hosts for this loop 8975 1727204030.14513: getting the next task for host managed-node2 8975 1727204030.14521: done getting next task for host managed-node2 8975 1727204030.14523: ^ task is: TASK: meta (flush_handlers) 8975 1727204030.14525: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204030.14530: getting variables 8975 1727204030.14531: in VariableManager get_vars() 8975 1727204030.14558: Calling all_inventory to load vars for managed-node2 8975 1727204030.14561: Calling groups_inventory to load vars for managed-node2 8975 1727204030.14642: done sending task result for task 127b8e07-fff9-9356-306d-0000000000cd 8975 1727204030.14647: WORKER PROCESS EXITING 8975 1727204030.14649: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204030.14661: Calling all_plugins_play to load vars for managed-node2 8975 1727204030.14664: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204030.14670: Calling groups_plugins_play to load vars for managed-node2 8975 1727204030.15240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.15656: done with get_vars() 8975 1727204030.15678: done getting variables 8975 1727204030.15751: in VariableManager get_vars() 8975 1727204030.15775: Calling all_inventory to load vars for managed-node2 8975 1727204030.15778: Calling groups_inventory to load vars for managed-node2 8975 1727204030.15781: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204030.15786: Calling all_plugins_play to load vars for managed-node2 8975 1727204030.15789: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204030.15791: Calling groups_plugins_play to load vars for managed-node2 8975 1727204030.16041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.16503: done with get_vars() 8975 1727204030.16523: done queuing things up, now waiting for results queue to drain 8975 1727204030.16526: results queue empty 8975 1727204030.16527: checking for any_errors_fatal 8975 1727204030.16530: done checking for any_errors_fatal 8975 1727204030.16531: checking for max_fail_percentage 8975 1727204030.16532: done checking for max_fail_percentage 8975 1727204030.16538: checking to see if all hosts have failed and the running result is not ok 8975 1727204030.16539: done checking to see if all hosts have failed 8975 1727204030.16539: getting the remaining hosts for this loop 8975 1727204030.16540: done getting the remaining hosts for this loop 8975 1727204030.16544: getting the next task for host managed-node2 8975 1727204030.16550: done getting next task for host managed-node2 8975 1727204030.16553: ^ task is: TASK: Include the task 'el_repo_setup.yml' 8975 1727204030.16555: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204030.16557: getting variables 8975 1727204030.16558: in VariableManager get_vars() 8975 1727204030.16571: Calling all_inventory to load vars for managed-node2 8975 1727204030.16573: Calling groups_inventory to load vars for managed-node2 8975 1727204030.16576: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204030.16582: Calling all_plugins_play to load vars for managed-node2 8975 1727204030.16585: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204030.16587: Calling groups_plugins_play to load vars for managed-node2 8975 1727204030.16786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.17212: done with get_vars() 8975 1727204030.17222: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:11 Tuesday 24 September 2024 14:53:50 -0400 (0:00:01.482) 0:00:01.490 ***** 8975 1727204030.17382: entering _queue_task() for managed-node2/include_tasks 8975 1727204030.17384: Creating lock for include_tasks 8975 1727204030.18057: worker is 1 (out of 1 available) 8975 1727204030.18080: exiting _queue_task() for managed-node2/include_tasks 8975 1727204030.18092: done queuing things up, now waiting for results queue to drain 8975 1727204030.18093: waiting for pending results... 8975 1727204030.18296: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 8975 1727204030.18401: in run() - task 127b8e07-fff9-9356-306d-000000000006 8975 1727204030.18572: variable 'ansible_search_path' from source: unknown 8975 1727204030.18576: calling self._execute() 8975 1727204030.18579: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204030.18582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204030.18584: variable 'omit' from source: magic vars 8975 1727204030.18691: _execute() done 8975 1727204030.18810: dumping result to json 8975 1727204030.18814: done dumping result, returning 8975 1727204030.18816: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-9356-306d-000000000006] 8975 1727204030.18819: sending task result for task 127b8e07-fff9-9356-306d-000000000006 8975 1727204030.19273: done sending task result for task 127b8e07-fff9-9356-306d-000000000006 8975 1727204030.19277: WORKER PROCESS EXITING 8975 1727204030.19418: no more pending results, returning what we have 8975 1727204030.19423: in VariableManager get_vars() 8975 1727204030.19462: Calling all_inventory to load vars for managed-node2 8975 1727204030.19474: Calling groups_inventory to load vars for managed-node2 8975 1727204030.19478: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204030.19493: Calling all_plugins_play to load vars for managed-node2 8975 1727204030.19496: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204030.19500: Calling groups_plugins_play to load vars for managed-node2 8975 1727204030.19803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.20328: done with get_vars() 8975 1727204030.20338: variable 'ansible_search_path' from source: unknown 8975 1727204030.20355: we have included files to process 8975 1727204030.20356: generating all_blocks data 8975 
1727204030.20358: done generating all_blocks data 8975 1727204030.20358: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8975 1727204030.20360: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8975 1727204030.20573: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 8975 1727204030.21557: in VariableManager get_vars() 8975 1727204030.21580: done with get_vars() 8975 1727204030.21594: done processing included file 8975 1727204030.21596: iterating over new_blocks loaded from include file 8975 1727204030.21598: in VariableManager get_vars() 8975 1727204030.21775: done with get_vars() 8975 1727204030.21777: filtering new block on tags 8975 1727204030.21795: done filtering new block on tags 8975 1727204030.21799: in VariableManager get_vars() 8975 1727204030.21810: done with get_vars() 8975 1727204030.21812: filtering new block on tags 8975 1727204030.21831: done filtering new block on tags 8975 1727204030.21834: in VariableManager get_vars() 8975 1727204030.21846: done with get_vars() 8975 1727204030.21847: filtering new block on tags 8975 1727204030.21860: done filtering new block on tags 8975 1727204030.21862: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 8975 1727204030.21873: extending task lists for all hosts with included blocks 8975 1727204030.21925: done extending task lists 8975 1727204030.21927: done processing included files 8975 1727204030.21928: results queue empty 8975 1727204030.21929: checking for any_errors_fatal 8975 1727204030.21930: done checking for any_errors_fatal 8975 1727204030.21931: checking for max_fail_percentage 8975 1727204030.21932: done checking for max_fail_percentage 8975 1727204030.21933: checking to see if all hosts have failed and the running result is not ok 8975 1727204030.21934: done checking to see if all hosts have failed 8975 1727204030.21934: getting the remaining hosts for this loop 8975 1727204030.21936: done getting the remaining hosts for this loop 8975 1727204030.21939: getting the next task for host managed-node2 8975 1727204030.21943: done getting next task for host managed-node2 8975 1727204030.21945: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 8975 1727204030.21947: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204030.21949: getting variables 8975 1727204030.21950: in VariableManager get_vars() 8975 1727204030.21959: Calling all_inventory to load vars for managed-node2 8975 1727204030.21961: Calling groups_inventory to load vars for managed-node2 8975 1727204030.21964: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204030.21973: Calling all_plugins_play to load vars for managed-node2 8975 1727204030.21975: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204030.21979: Calling groups_plugins_play to load vars for managed-node2 8975 1727204030.22276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.22471: done with get_vars() 8975 1727204030.22516: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:53:50 -0400 (0:00:00.052) 0:00:01.542 ***** 8975 1727204030.22605: entering _queue_task() for managed-node2/setup 8975 1727204030.23413: worker is 1 (out of 1 available) 8975 1727204030.23425: exiting _queue_task() for managed-node2/setup 8975 1727204030.23440: done queuing things up, now waiting for results queue to drain 8975 1727204030.23441: waiting for pending results... 8975 1727204030.23688: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 8975 1727204030.23904: in run() - task 127b8e07-fff9-9356-306d-0000000000de 8975 1727204030.23931: variable 'ansible_search_path' from source: unknown 8975 1727204030.23939: variable 'ansible_search_path' from source: unknown 8975 1727204030.23985: calling self._execute() 8975 1727204030.24129: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204030.24136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204030.24139: variable 'omit' from source: magic vars 8975 1727204030.24740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204030.27832: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204030.27916: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204030.27971: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204030.28039: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204030.28061: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204030.28258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204030.28262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204030.28264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8975 1727204030.28301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204030.28327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204030.28541: variable 'ansible_facts' from source: unknown 8975 1727204030.28642: variable 'network_test_required_facts' from source: task vars 8975 1727204030.28702: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 8975 1727204030.28714: variable 'omit' from source: magic vars 8975 1727204030.28766: variable 'omit' from source: magic vars 8975 1727204030.28826: variable 'omit' from source: magic vars 8975 1727204030.28861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204030.28899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204030.28933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204030.28957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204030.29019: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204030.29025: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204030.29028: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204030.29030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204030.29151: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204030.29159: Set connection var ansible_connection to ssh 8975 1727204030.29171: Set connection var ansible_shell_executable to /bin/sh 8975 1727204030.29182: Set connection var ansible_timeout to 10 8975 1727204030.29189: Set connection var ansible_shell_type to sh 8975 1727204030.29239: Set connection var ansible_pipelining to False 8975 1727204030.29244: variable 'ansible_shell_executable' from source: unknown 8975 1727204030.29251: variable 'ansible_connection' from source: unknown 8975 1727204030.29258: variable 'ansible_module_compression' from source: unknown 8975 1727204030.29265: variable 'ansible_shell_type' from source: unknown 8975 1727204030.29273: variable 'ansible_shell_executable' from source: unknown 8975 1727204030.29280: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204030.29287: variable 'ansible_pipelining' from source: unknown 8975 1727204030.29348: variable 'ansible_timeout' from source: unknown 8975 1727204030.29352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204030.29487: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204030.29505: variable 'omit' from source: magic vars 8975 1727204030.29514: starting attempt loop 8975 1727204030.29523: running the handler 8975 1727204030.29540: 
_low_level_execute_command(): starting 8975 1727204030.29552: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204030.30381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204030.30453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204030.30519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204030.30558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204030.30625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204030.30999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204030.32756: stdout chunk (state=3): >>>/root <<< 8975 1727204030.33056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204030.33060: stderr chunk (state=3): >>><<< 8975 1727204030.33063: stdout chunk (state=3): >>><<< 8975 1727204030.33068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204030.33080: _low_level_execute_command(): starting 8975 1727204030.33083: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062 `" && echo ansible-tmp-1727204030.330114-9445-134253296449062="` echo /root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062 `" ) && sleep 0' 8975 1727204030.34667: 
stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204030.34972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204030.34994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204030.35092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204030.37152: stdout chunk (state=3): >>>ansible-tmp-1727204030.330114-9445-134253296449062=/root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062 <<< 8975 1727204030.37367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204030.37372: stderr chunk (state=3): >>><<< 8975 1727204030.37375: stdout chunk (state=3): >>><<< 8975 1727204030.37572: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204030.330114-9445-134253296449062=/root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204030.37577: variable 'ansible_module_compression' from source: unknown 8975 1727204030.37599: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8975 1727204030.37661: variable 'ansible_facts' from source: unknown 8975 1727204030.38076: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/AnsiballZ_setup.py 8975 1727204030.38173: Sending initial data 
8975 1727204030.38178: Sent initial data (151 bytes) 8975 1727204030.38747: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204030.38756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204030.38769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204030.38884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204030.38892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204030.38992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204030.40619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204030.40701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204030.40790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp56_erp82 /root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/AnsiballZ_setup.py <<< 8975 1727204030.40799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/AnsiballZ_setup.py" <<< 8975 1727204030.41100: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp56_erp82" to remote "/root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/AnsiballZ_setup.py" <<< 8975 1727204030.43703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204030.44157: stderr chunk (state=3): >>><<< 8975 1727204030.44161: stdout chunk (state=3): >>><<< 8975 1727204030.44163: done transferring module to remote 8975 1727204030.44169: _low_level_execute_command(): starting 8975 1727204030.44171: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/ /root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/AnsiballZ_setup.py && sleep 0' 8975 1727204030.45379: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204030.45383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8975 1727204030.45396: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204030.45595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204030.45669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204030.45673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204030.45676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204030.45758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204030.47898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204030.47902: stdout chunk (state=3): >>><<< 8975 1727204030.47905: stderr chunk (state=3): >>><<< 8975 1727204030.47908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204030.47910: _low_level_execute_command(): starting 8975 1727204030.47913: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/AnsiballZ_setup.py && sleep 0' 8975 1727204030.49475: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204030.49547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204030.49571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204030.49589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204030.49801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204030.52119: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 8975 1727204030.52142: stdout chunk (state=3): >>>import _imp # builtin <<< 8975 1727204030.52171: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 8975 1727204030.52179: stdout chunk (state=3): >>>import '_weakref' # <<< 8975 1727204030.52506: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 8975 1727204030.52512: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 8975 1727204030.52540: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 8975 1727204030.52601: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a73fc530> <<< 8975 1727204030.52604: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a73cbb30> <<< 8975 1727204030.52625: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 8975 1727204030.52628: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a73feab0> <<< 8975 1727204030.52633: stdout chunk (state=3): >>>import '_signal' # <<< 8975 1727204030.52716: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 8975 1727204030.52720: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 8975 1727204030.52817: stdout chunk (state=3): >>>import '_collections_abc' # <<< 8975 1727204030.52932: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 8975 1727204030.52952: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 8975 1727204030.52978: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 8975 1727204030.52999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 8975 1727204030.53119: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a71d11c0> <<< 8975 1727204030.53141: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a71d20c0> import 'site' # <<< 8975 1727204030.53160: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 8975 1727204030.53579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 8975 1727204030.53614: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 8975 1727204030.53618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204030.53670: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 8975 1727204030.53701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 8975 1727204030.53725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 8975 1727204030.53799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a720ffe0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 8975 1727204030.53822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 8975 1727204030.53833: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7224170> <<< 8975 1727204030.53909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 8975 1727204030.53912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 8975 1727204030.53962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204030.54112: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72479b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 8975 1727204030.54118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 8975 1727204030.54137: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7247f80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7227c50> import '_functools' # <<< 8975 1727204030.54164: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72253d0> <<< 8975 1727204030.54273: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a720d190> <<< 8975 1727204030.54292: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 8975 1727204030.54588: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 8975 1727204030.54592: stdout chunk (state=3): >>>import '_sre' # <<< 8975 1727204030.54629: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a726b980> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a726a5a0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7268d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7298a10> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a720c410> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7298ec0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7298d70> <<< 8975 1727204030.54661: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204030.54682: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7299160> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a720af30> <<< 8975 1727204030.54711: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 8975 1727204030.54737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 8975 1727204030.54769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 8975 1727204030.54800: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72997f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72994f0> <<< 8975 1727204030.54817: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 8975 1727204030.54853: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a729a6f0> <<< 8975 1727204030.54883: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 8975 1727204030.54911: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 8975 1727204030.54948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 8975 1727204030.54982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 8975 1727204030.55000: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72b48f0> import 'errno' # <<< 8975 1727204030.55038: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204030.55269: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a72b6030> <<< 8975 1727204030.55275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 8975 1727204030.55304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72b6ed0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a72b7500> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72b6420> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a72b7ec0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72b75f0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a729a660> <<< 8975 1727204030.55325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 8975 1727204030.55351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 8975 1727204030.55374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches 
/usr/lib64/python3.12/random.py <<< 8975 1727204030.55395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 8975 1727204030.55518: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6ffbd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7024830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7024590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7024860> <<< 8975 1727204030.55550: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7024a40> <<< 8975 1727204030.55570: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ff9ee0> <<< 8975 1727204030.55597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 8975 1727204030.55697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 8975 1727204030.55729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 8975 1727204030.55750: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70260f0> <<< 8975 1727204030.55768: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7024d70> <<< 8975 1727204030.55833: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a729ae10> <<< 8975 1727204030.55848: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 8975 1727204030.55876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204030.55893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 8975 1727204030.55945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' 
<<< 8975 1727204030.55970: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a704e4b0> <<< 8975 1727204030.56028: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 8975 1727204030.56053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204030.56131: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 8975 1727204030.56135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 8975 1727204030.56162: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a706a5a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8975 1727204030.56301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70a3350> <<< 8975 1727204030.56305: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 8975 1727204030.56350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 8975 1727204030.56374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 8975 1727204030.56420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 8975 1727204030.56513: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70c9af0> <<< 8975 1727204030.56609: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70a3470> <<< 8975 1727204030.56631: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a706b200> <<< 8975 1727204030.56663: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 8975 1727204030.56715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ea44a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70695e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7027050> <<< 8975 1727204030.56862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 8975 1727204030.56929: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f82a6ea4740> <<< 8975 1727204030.57062: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_6rmrvo7c/ansible_setup_payload.zip' <<< 8975 
1727204030.57083: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.57223: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.57263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8975 1727204030.57299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8975 1727204030.57462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8975 1727204030.57471: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 8975 1727204030.57475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f122a0> import '_typing' # <<< 8975 1727204030.57632: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ee9190> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ee82f0> <<< 8975 1727204030.57717: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.57732: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 8975 1727204030.57754: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.59329: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.60618: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6eeb290> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 8975 1727204030.60886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6f41cd0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f41a60> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f41370> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f41d90> import 
'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f12cc0> import 'atexit' # <<< 8975 1727204030.60893: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6f42990> <<< 8975 1727204030.60922: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6f42b10> <<< 8975 1727204030.60942: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 8975 1727204030.60992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 8975 1727204030.61010: stdout chunk (state=3): >>>import '_locale' # <<< 8975 1727204030.61043: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f42f60> <<< 8975 1727204030.61070: stdout chunk (state=3): >>>import 'pwd' # <<< 8975 1727204030.61089: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 8975 1727204030.61108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 8975 1727204030.61141: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6da8d70> <<< 8975 1727204030.61176: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204030.61207: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6daa990> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 8975 1727204030.61218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 8975 1727204030.61260: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dab2c0> <<< 8975 1727204030.61279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 8975 1727204030.61341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 8975 1727204030.61344: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dac440> <<< 8975 1727204030.61347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 8975 1727204030.61383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 8975 1727204030.61404: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 8975 
1727204030.61420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 8975 1727204030.61594: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6daef00> <<< 8975 1727204030.61598: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6daf290> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dad1f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 8975 1727204030.61616: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 8975 1727204030.61633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 8975 1727204030.61658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 8975 1727204030.61692: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 8975 1727204030.61758: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6db2f90> import '_tokenize' # <<< 8975 1727204030.61797: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6db1a60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6db17c0> <<< 8975 1727204030.61971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 8975 1727204030.62006: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6db3fe0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dad700> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6df7050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6df7260> <<< 8975 1727204030.62036: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 8975 1727204030.62067: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 8975 1727204030.62122: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6dfcda0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dfcb60> <<< 8975 1727204030.62136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 8975 1727204030.62235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 8975 1727204030.62290: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6dff260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dfd490> <<< 8975 1727204030.62313: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 8975 1727204030.62359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204030.62396: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 8975 1727204030.62449: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6e06a80> <<< 8975 1727204030.62703: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dff410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e078c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e07ad0> <<< 8975 1727204030.62739: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f82a6e07bc0> <<< 8975 1727204030.62760: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6df7440> <<< 8975 1727204030.62788: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 8975 1727204030.62807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 8975 1727204030.62830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 8975 1727204030.62855: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204030.62886: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e0b4a0> <<< 8975 1727204030.63052: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204030.63075: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e0c8f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6e09c40> <<< 8975 1727204030.63106: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e0aff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6e09850> <<< 8975 1727204030.63157: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 8975 1727204030.63232: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.63269: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.63384: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 8975 1727204030.63406: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.63553: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 8975 1727204030.63700: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.64313: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.64977: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204030.65027: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6c94950> <<< 8975 1727204030.65623: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c95760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6e0ca70> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.65627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c95520> # zipimport: zlib available <<< 8975 1727204030.66141: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.66643: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.66722: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.66799: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 8975 1727204030.66892: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 8975 1727204030.66952: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.66982: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.67198: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 8975 1727204030.67220: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.67560: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.67729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 8975 1727204030.67791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 8975 1727204030.67814: stdout chunk (state=3): >>>import '_ast' # <<< 8975 1727204030.67870: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c966f0> <<< 8975 1727204030.67890: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.67964: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.68049: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 8975 1727204030.68071: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 8975 1727204030.68095: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 8975 1727204030.68179: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204030.68309: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6c9e330> <<< 8975 1727204030.68572: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6c9ec90> <<< 8975 1727204030.68576: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c96ff0> <<< 8975 1727204030.68594: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.68650: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.68721: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 8975 1727204030.68759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204030.68894: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6c9dac0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c9ee10> <<< 8975 1727204030.69014: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.69073: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.69513: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from 
'/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d32de0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ca8b30> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ca6cf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c9d8e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 8975 1727204030.69557: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.69581: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 8975 1727204030.69640: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 8975 1727204030.69667: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.69815: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 8975 1727204030.69840: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.69911: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.69960: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.69994: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.70036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 8975 1727204030.70052: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.70312: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.70326: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 8975 1727204030.70475: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.70681: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.70710: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.70772: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 8975 1727204030.70796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 8975 1727204030.71120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d35be0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a62bc350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a62bc680> <<< 8975 1727204030.71232: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d153d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d14290> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d342c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d37d70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 8975 1727204030.71257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 8975 1727204030.71285: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 8975 1727204030.71312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 8975 1727204030.71328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 8975 1727204030.71353: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a62bf590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a62bee40> <<< 8975 1727204030.71389: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a62bf020> <<< 8975 1727204030.71574: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a62be270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 8975 1727204030.71622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a62bf710> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a63261e0> <<< 8975 1727204030.71656: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6324230> <<< 8975 1727204030.71787: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d34ce0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 8975 1727204030.71839: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.71887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 8975 1727204030.71900: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.72029: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 8975 1727204030.72054: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 8975 1727204030.72214: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.72251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 8975 1727204030.72255: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.72298: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.72346: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 8975 1727204030.72360: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.72420: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.72612: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.72616: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 8975 1727204030.72632: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.73173: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.73783: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.73811: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.73853: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 8975 1727204030.74102: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.74105: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.74135: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 8975 1727204030.74152: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.74184: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.74212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 8975 1727204030.74225: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.74456: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6326300> <<< 8975 1727204030.74460: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 8975 1727204030.74490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 8975 1727204030.74899: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6326fc0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.74992: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 8975 1727204030.74996: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.75056: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.75138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 8975 1727204030.75158: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.75193: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.75371: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204030.75434: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a635a570> <<< 8975 1727204030.75694: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6342d20> import 'ansible.module_utils.facts.system.python' # <<< 8975 1727204030.75762: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.75789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 8975 1727204030.75988: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.76097: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.76384: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 8975 1727204030.76388: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.76552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a614ddf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a614de50> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 8975 1727204030.76571: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 8975 1727204030.76619: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.76636: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.76873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 8975 1727204030.76877: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.76879: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.77092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 8975 1727204030.77141: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.77555: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.77580: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.77724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 8975 1727204030.77787: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.77873: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.78191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.78715: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.79523: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.79544: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 8975 1727204030.79633: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.79745: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 8975 1727204030.79922: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.80093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 8975 1727204030.80124: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 8975 1727204030.80141: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.80191: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.80224: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 8975 1727204030.80243: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.80349: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.80456: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.80684: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.80913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 8975 1727204030.81087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 8975 1727204030.81154: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.81231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 8975 1727204030.81245: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.81267: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.81290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 8975 1727204030.81309: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.81369: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.81493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 8975 1727204030.81507: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.81565: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 8975 1727204030.81582: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.81872: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82162: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 8975 1727204030.82186: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82230: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82507: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 8975 1727204030.82515: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82554: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 8975 1727204030.82557: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82642: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82728: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 8975 1727204030.82762: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 8975 1727204030.82783: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 8975 1727204030.82831: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82878: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 8975 1727204030.82896: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 8975 1727204030.82928: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82932: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.82987: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.83040: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.83113: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.83191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 8975 1727204030.83214: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 8975 1727204030.83488: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 8975 1727204030.83559: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.83776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 8975 1727204030.83793: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.83836: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.83889: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 8975 1727204030.83902: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.83946: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.84000: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 8975 1727204030.84018: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.84100: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.84197: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 8975 1727204030.84212: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.84307: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.84409: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 8975 1727204030.84490: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204030.85468: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 8975 1727204030.85502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 8975 1727204030.85695: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6177d10> <<< 8975 1727204030.85711: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6175580> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6175700> <<< 8975 1727204030.85991: stdout chunk 
(state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": 
"disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "50", "epoch": "1727204030", "epoch_int": "1727204030", "date": "2024-09-24", "time": "14:53:50", "iso8601_micro": "2024-09-24T18:53:50.857527Z", "iso8601": "2024-09-24T18:53:50Z", "iso8601_basic": "20240924T145350857527", "iso8601_basic_short": "20240924T145350", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8975 1727204030.86518: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 8975 1727204030.86584: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ <<< 8975 1727204030.86791: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing 
re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder <<< 8975 1727204030.86834: stdout chunk (state=3): >>># cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # 
destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] 
removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing 
ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time <<< 8975 1727204030.86877: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy 
ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 8975 1727204030.87205: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 8975 1727204030.87237: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 8975 1727204030.87267: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 8975 1727204030.87288: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 8975 1727204030.87319: stdout chunk (state=3): >>># destroy ntpath <<< 8975 1727204030.87346: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 8975 1727204030.87392: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 8975 1727204030.87477: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 8975 1727204030.87488: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 8975 1727204030.87519: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 8975 1727204030.87563: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue <<< 8975 1727204030.87810: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 8975 1727204030.87814: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch 
# destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 8975 1727204030.87867: stdout chunk (state=3): >>># destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 8975 1727204030.87899: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 8975 1727204030.87934: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 8975 1727204030.87945: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 8975 1727204030.87972: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8975 1727204030.88107: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 8975 1727204030.88133: stdout chunk (state=3): >>># destroy _collections <<< 8975 1727204030.88164: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 8975 1727204030.88196: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 8975 1727204030.88252: stdout chunk (state=3): >>># destroy _typing <<< 8975 
1727204030.88256: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 8975 1727204030.88276: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 8975 1727204030.88419: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 8975 1727204030.88423: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 8975 1727204030.88427: stdout chunk (state=3): >>># destroy time <<< 8975 1727204030.88451: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 8975 1727204030.88478: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools <<< 8975 1727204030.88670: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 8975 1727204030.89194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204030.89208: stdout chunk (state=3): >>><<< 8975 1727204030.89220: stderr chunk (state=3): >>><<< 8975 1727204030.89785: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a73fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a73cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a73feab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: 
'/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a71d11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a71d20c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a720ffe0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7224170> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72479b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7247f80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7227c50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72253d0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a720d190> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a726b980> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a726a5a0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7268d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7298a10> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a720c410> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7298ec0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7298d70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7299160> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a720af30> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72997f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72994f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a729a6f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72b48f0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a72b6030> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72b6ed0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a72b7500> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72b6420> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a72b7ec0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a72b75f0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a729a660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6ffbd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7024830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7024590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7024860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a7024a40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ff9ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70260f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7024d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a729ae10> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a704e4b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a706a5a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70a3350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70c9af0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70a3470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a706b200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ea44a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a70695e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a7027050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f82a6ea4740> # zipimport: found 103 names in '/tmp/ansible_setup_payload_6rmrvo7c/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f122a0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ee9190> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ee82f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6eeb290> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6f41cd0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f41a60> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f41370> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f41d90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f12cc0> import 'atexit' # # extension module 'grp' 
loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6f42990> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6f42b10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6f42f60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6da8d70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6daa990> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dab2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dac440> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6daef00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6daf290> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dad1f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6db2f90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6db1a60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6db17c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6db3fe0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dad700> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6df7050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6df7260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6dfcda0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dfcb60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6dff260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dfd490> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6e06a80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6dff410> # extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e078c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e07ad0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e07bc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6df7440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e0b4a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e0c8f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6e09c40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6e0aff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6e09850> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6c94950> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c95760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6e0ca70> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c95520> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c966f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6c9e330> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6c9ec90> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c96ff0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: 
zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a6c9dac0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c9ee10> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d32de0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ca8b30> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6ca6cf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6c9d8e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d35be0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a62bc350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a62bc680> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d153d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d14290> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d342c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d37d70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a62bf590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a62bee40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a62bf020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a62be270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a62bf710> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a63261e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6324230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6d34ce0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6326300> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6326fc0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a635a570> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6342d20> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f82a614ddf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a614de50> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f82a6177d10> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6175580> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f82a6175700> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "50", "epoch": "1727204030", "epoch_int": "1727204030", "date": "2024-09-24", "time": "14:53:50", "iso8601_micro": "2024-09-24T18:53:50.857527Z", "iso8601": "2024-09-24T18:53:50Z", "iso8601_basic": "20240924T145350857527", "iso8601_basic_short": "20240924T145350", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants 
# cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing 
ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy 
ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # 
destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. [WARNING]: Module invocation had junk after the JSON data:
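The junk flagged by this warning is the target interpreter's shutdown trace (the "# clear ..." / "# destroy ..." lines already shown in full in the module stdout above), printed after the module's single JSON document. Ansible keeps the JSON result and demotes everything after it to this warning. Below is a minimal standard-library sketch of that kind of recovery; it illustrates the idea only, it is not Ansible's own parser, and the sample string is invented and heavily shortened.

```python
import json

def split_json_and_junk(stdout: str):
    """Return (parsed_result, trailing_junk) from module stdout that may
    carry interpreter shutdown noise after the JSON document."""
    start = stdout.index("{")                          # first byte of the JSON object
    obj, end = json.JSONDecoder().raw_decode(stdout, start)  # stop at end of JSON
    return obj, stdout[end:].strip()

# Invented sample resembling the stdout above:
sample = '{"ansible_facts": {"ansible_system": "Linux"}} # clear sys.path_importer_cache # destroy re'
result, junk = split_json_and_junk(sample)
print(result["ansible_facts"]["ansible_system"])   # -> Linux
print(junk)                                        # -> # clear sys.path_importer_cache # destroy re
```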
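Every remote command in this run is funnelled through the same OpenSSH ControlMaster socket, which is why the log keeps printing "auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'" and "master session id: 2" instead of negotiating a fresh SSH session per command. The sketch below shows how such a mux socket can be probed or torn down from Python with the stock OpenSSH client; the socket path and address are copied from the log, while the helper functions themselves are illustrative and not part of Ansible.

```python
import subprocess

CONTROL_PATH = "/root/.ansible/cp/7ef5e35320"   # mux socket seen in the log
HOST = "10.31.47.73"                            # managed-node2's address in the log

def mux_alive() -> bool:
    """Ask the OpenSSH client whether the master connection is still running."""
    result = subprocess.run(
        ["ssh", "-S", CONTROL_PATH, "-O", "check", HOST],
        capture_output=True, text=True,
    )
    return result.returncode == 0

def mux_stop() -> None:
    """Tear the master connection down (what a ControlPersist timeout eventually does)."""
    subprocess.run(["ssh", "-S", CONTROL_PATH, "-O", "exit", HOST],
                   capture_output=True, text=True)

if __name__ == "__main__":
    print("master alive:", mux_alive())
```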
8975 1727204030.91351: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts':
False, '_ansible_target_log_info': None}) 8975 1727204030.91355: _low_level_execute_command(): starting 8975 1727204030.91359: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204030.330114-9445-134253296449062/ > /dev/null 2>&1 && sleep 0' 8975 1727204030.91873: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204030.91878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204030.91881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204030.91884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204030.91887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204030.91889: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204030.91892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204030.91894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204030.91896: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204030.91899: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204030.91918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204030.91926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204030.91942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204030.91947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204030.92067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204030.94258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204030.94263: stdout chunk (state=3): >>><<< 8975 1727204030.94269: stderr chunk (state=3): >>><<< 8975 1727204030.94289: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204030.94295: handler run complete 8975 1727204030.94352: variable 'ansible_facts' from source: unknown 8975 1727204030.94414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.95013: variable 'ansible_facts' from source: unknown 8975 1727204030.95335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.95398: attempt loop complete, returning result 8975 1727204030.95402: _execute() done 8975 1727204030.95405: dumping result to json 8975 1727204030.95642: done dumping result, returning 8975 1727204030.95652: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-9356-306d-0000000000de] 8975 1727204030.95658: sending task result for task 127b8e07-fff9-9356-306d-0000000000de ok: [managed-node2] 8975 1727204030.95957: no more pending results, returning what we have 8975 1727204030.95961: results queue empty 8975 1727204030.95962: checking for any_errors_fatal 8975 1727204030.95963: done checking for any_errors_fatal 8975 1727204030.95964: checking for max_fail_percentage 8975 1727204030.95968: done checking for max_fail_percentage 8975 1727204030.95969: checking to see if all hosts have failed and the running result is not ok 8975 1727204030.95970: done checking to see if all hosts have failed 8975 1727204030.95971: getting the remaining hosts for this loop 8975 1727204030.95972: done getting the remaining hosts for this loop 8975 1727204030.95977: getting the next task for host managed-node2 8975 1727204030.95988: done getting next task for host managed-node2 8975 1727204030.95991: ^ task is: TASK: Check if system is ostree 8975 1727204030.95993: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204030.95997: getting variables 8975 1727204030.95999: in VariableManager get_vars() 8975 1727204030.96032: Calling all_inventory to load vars for managed-node2 8975 1727204030.96035: Calling groups_inventory to load vars for managed-node2 8975 1727204030.96038: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204030.96051: Calling all_plugins_play to load vars for managed-node2 8975 1727204030.96054: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204030.96057: Calling groups_plugins_play to load vars for managed-node2 8975 1727204030.97419: done sending task result for task 127b8e07-fff9-9356-306d-0000000000de 8975 1727204030.97427: WORKER PROCESS EXITING 8975 1727204030.97437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204030.97747: done with get_vars() 8975 1727204030.97761: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:53:50 -0400 (0:00:00.753) 0:00:02.296 ***** 8975 1727204030.97986: entering _queue_task() for managed-node2/stat 8975 1727204030.98874: worker is 1 (out of 1 available) 8975 1727204030.98890: exiting _queue_task() for managed-node2/stat 8975 1727204030.98905: done queuing things up, now waiting for results queue to drain 8975 1727204030.98906: waiting for pending results... 8975 1727204030.99385: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 8975 1727204030.99569: in run() - task 127b8e07-fff9-9356-306d-0000000000e0 8975 1727204030.99655: variable 'ansible_search_path' from source: unknown 8975 1727204030.99663: variable 'ansible_search_path' from source: unknown 8975 1727204030.99787: calling self._execute() 8975 1727204030.99962: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.00293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.00298: variable 'omit' from source: magic vars 8975 1727204031.01254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204031.01915: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204031.02073: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204031.02365: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204031.02372: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204031.02425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204031.02614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204031.02648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204031.02686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204031.03128: Evaluated conditional (not __network_is_ostree is defined): True 8975 1727204031.03132: variable 'omit' from source: magic vars 8975 1727204031.03135: variable 'omit' from source: magic vars 8975 1727204031.03193: variable 'omit' from source: magic vars 8975 1727204031.03269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204031.03377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204031.03400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204031.03421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204031.03465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204031.03502: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204031.03568: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.03576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.03789: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204031.03797: Set connection var ansible_connection to ssh 8975 1727204031.03807: Set connection var ansible_shell_executable to /bin/sh 8975 1727204031.03818: Set connection var ansible_timeout to 10 8975 1727204031.03992: Set connection var ansible_shell_type to sh 8975 1727204031.03995: Set connection var ansible_pipelining to False 8975 1727204031.03998: variable 'ansible_shell_executable' from source: unknown 8975 1727204031.04001: variable 'ansible_connection' from source: unknown 8975 1727204031.04005: variable 'ansible_module_compression' from source: unknown 8975 1727204031.04007: variable 'ansible_shell_type' from source: unknown 8975 1727204031.04009: variable 'ansible_shell_executable' from source: unknown 8975 1727204031.04011: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.04013: variable 'ansible_pipelining' from source: unknown 8975 1727204031.04015: variable 'ansible_timeout' from source: unknown 8975 1727204031.04017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.04355: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204031.04443: variable 'omit' from source: magic vars 8975 1727204031.04454: starting attempt loop 8975 1727204031.04461: running the handler 8975 1727204031.04645: _low_level_execute_command(): starting 8975 1727204031.04649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204031.05905: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204031.06300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204031.06304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204031.08015: stdout chunk (state=3): >>>/root <<< 8975 1727204031.08290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204031.08374: stderr chunk (state=3): >>><<< 8975 1727204031.08379: stdout chunk (state=3): >>><<< 8975 1727204031.08405: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204031.08639: _low_level_execute_command(): starting 8975 1727204031.08643: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897 `" && echo ansible-tmp-1727204031.0842266-9561-80538599227897="` echo /root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897 `" ) && sleep 0' 8975 1727204031.09409: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204031.09428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204031.09444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204031.09468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204031.09492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204031.09588: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204031.09620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204031.09639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204031.09668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204031.09840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204031.11814: stdout chunk (state=3): >>>ansible-tmp-1727204031.0842266-9561-80538599227897=/root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897 <<< 8975 1727204031.12323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204031.12332: stderr chunk (state=3): >>><<< 8975 1727204031.12336: stdout chunk (state=3): >>><<< 8975 1727204031.12339: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204031.0842266-9561-80538599227897=/root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204031.12341: variable 'ansible_module_compression' from source: unknown 8975 1727204031.12344: ANSIBALLZ: Using lock for stat 8975 1727204031.12346: ANSIBALLZ: Acquiring lock 8975 1727204031.12348: ANSIBALLZ: Lock acquired: 140501805978864 8975 1727204031.12350: ANSIBALLZ: Creating module 8975 1727204031.26427: ANSIBALLZ: Writing module into payload 8975 1727204031.26536: ANSIBALLZ: Writing module 8975 1727204031.26564: ANSIBALLZ: Renaming module 8975 1727204031.26577: ANSIBALLZ: Done creating module 8975 1727204031.26596: variable 'ansible_facts' from source: unknown 8975 1727204031.26672: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/AnsiballZ_stat.py 8975 1727204031.26900: Sending initial data 8975 1727204031.26903: Sent initial data (150 bytes) 8975 1727204031.27487: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
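The two _low_level_execute_command() calls above show the per-task setup pattern visible throughout this log: first resolve the remote user's home directory (the "echo /root" result), then create a per-task temp directory under /root/.ansible/tmp whose name has the form ansible-tmp-<epoch>-<number>-<number> before the AnsiballZ payload is copied into it. A minimal, illustrative sketch of assembling that kind of command string follows; it is not the actual Ansible implementation, the helper name build_remote_tmp_command is made up for this note, and reading the last two name fields as a process id and a random value is an inference from the values in the log, not something the log states.

    # Hypothetical sketch only: rebuild the shape of the remote mkdir one-liner
    # seen in the log from its visible parts. Simplified: the real command in the
    # log additionally wraps each path in a backtick `echo ...` substitution.
    import random
    import shlex
    import time

    def build_remote_tmp_command(remote_tmp="/root/.ansible/tmp", pid=9561):
        # Name pattern observed above: ansible-tmp-<epoch float>-<pid-like>-<random>
        tmpdir = "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2**48))
        path = "%s/%s" % (remote_tmp, tmpdir)
        # The log wraps everything in: /bin/sh -c '( umask 77 && mkdir ... ) && sleep 0'
        inner = '( umask 77 && mkdir -p "%s" && mkdir "%s" && echo %s="%s" ) && sleep 0' % (
            remote_tmp, path, tmpdir, path)
        return "/bin/sh -c %s" % shlex.quote(inner)

    if __name__ == "__main__":
        print(build_remote_tmp_command())

Once the directory exists, the following records show the AnsiballZ_stat.py payload being uploaded into it over the existing SSH multiplexing socket, made executable with chmod u+x, and then run with PYTHONVERBOSE=1, which is what produces the long import trace below.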
8975 1727204031.27501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204031.27522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204031.27543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204031.27648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204031.27882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204031.29518: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8975 1727204031.29546: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 8975 1727204031.29565: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204031.29660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204031.29761: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpt_359m9x /root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/AnsiballZ_stat.py <<< 8975 1727204031.29774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/AnsiballZ_stat.py" <<< 8975 1727204031.29880: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpt_359m9x" to remote "/root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/AnsiballZ_stat.py" <<< 8975 1727204031.30878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204031.30888: stdout chunk (state=3): >>><<< 8975 1727204031.30890: stderr chunk (state=3): >>><<< 8975 1727204031.30973: done transferring module to remote 8975 1727204031.30978: _low_level_execute_command(): starting 8975 1727204031.30981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/ /root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/AnsiballZ_stat.py && sleep 0' 8975 1727204031.31692: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204031.31709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204031.31770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204031.31839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204031.31866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204031.31906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204031.31987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204031.33949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204031.33954: stdout chunk (state=3): >>><<< 8975 1727204031.33956: stderr chunk (state=3): >>><<< 8975 1727204031.33979: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204031.33987: _low_level_execute_command(): starting 8975 1727204031.33997: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/AnsiballZ_stat.py && sleep 0' 8975 1727204031.34724: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204031.34854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204031.34874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204031.34911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204031.35028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204031.37357: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 8975 1727204031.37402: stdout chunk (state=3): >>>import _imp # builtin <<< 8975 1727204031.37408: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 8975 1727204031.37484: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 8975 1727204031.37514: stdout chunk (state=3): >>>import 'posix' # <<< 8975 1727204031.37554: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 8975 1727204031.37635: stdout chunk (state=3): >>>import 'time' # <<< 8975 1727204031.37638: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 8975 1727204031.37673: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 8975 1727204031.37740: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 8975 1727204031.37764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72915fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72915cbb30> <<< 8975 1727204031.37803: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72915feab0> <<< 8975 1727204031.37838: stdout chunk (state=3): >>>import '_signal' # <<< 8975 1727204031.37841: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 8975 1727204031.37868: stdout chunk (state=3): >>> <<< 8975 1727204031.37873: stdout chunk (state=3): >>>import 'io' # <<< 8975 1727204031.37893: stdout chunk (state=3): >>>import '_stat' # <<< 8975 1727204031.37903: stdout chunk (state=3): >>>import 'stat' # <<< 8975 1727204031.37988: stdout chunk (state=3): >>>import '_collections_abc' # <<< 8975 1727204031.38014: stdout chunk (state=3): >>>import 'genericpath' # <<< 8975 1727204031.38021: stdout chunk (state=3): >>>import 'posixpath' # <<< 8975 1727204031.38043: stdout chunk (state=3): >>>import 'os' # <<< 8975 1727204031.38066: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 8975 1727204031.38072: stdout chunk (state=3): >>>Processing user site-packages <<< 8975 1727204031.38099: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 8975 1727204031.38109: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 8975 1727204031.38114: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 8975 1727204031.38148: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 8975 1727204031.38174: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913b11c0> <<< 8975 1727204031.38232: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 8975 1727204031.38248: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.38253: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913b20c0> <<< 8975 1727204031.38282: stdout chunk (state=3): >>>import 'site' # <<< 8975 1727204031.38309: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 8975 1727204031.38555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 8975 1727204031.38564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 8975 1727204031.38592: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 8975 1727204031.38595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.38616: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 8975 1727204031.38654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 8975 1727204031.38678: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 8975 1727204031.38696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 8975 1727204031.38714: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913effe0> <<< 8975 1727204031.38732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 8975 1727204031.38752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 8975 1727204031.38778: stdout chunk (state=3): >>>import '_operator' # <<< 8975 1727204031.38783: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291404170> <<< 8975 1727204031.38799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 8975 1727204031.38824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 8975 1727204031.38849: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 8975 1727204031.38904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.38914: stdout chunk (state=3): >>>import 'itertools' # <<< 8975 1727204031.38940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 8975 1727204031.38951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 8975 1727204031.38972: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914279b0> <<< 8975 1727204031.38975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 8975 1727204031.38985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291427f80> <<< 8975 1727204031.39000: stdout chunk (state=3): >>>import '_collections' # <<< 8975 1727204031.39056: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291407c50> <<< 8975 
1727204031.39064: stdout chunk (state=3): >>>import '_functools' # <<< 8975 1727204031.39090: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914053d0> <<< 8975 1727204031.39193: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913ed190> <<< 8975 1727204031.39213: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 8975 1727204031.39238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 8975 1727204031.39250: stdout chunk (state=3): >>>import '_sre' # <<< 8975 1727204031.39274: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 8975 1727204031.39297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 8975 1727204031.39315: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 8975 1727204031.39324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 8975 1727204031.39353: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729144b980> <<< 8975 1727204031.39368: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729144a5a0> <<< 8975 1727204031.39398: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 8975 1727204031.39414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914062a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291448d70> <<< 8975 1727204031.39466: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 8975 1727204031.39483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291478a10> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913ec410> <<< 8975 1727204031.39500: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 8975 1727204031.39542: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291478ec0> <<< 8975 1727204031.39551: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291478d70> <<< 8975 1727204031.39582: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204031.39592: stdout chunk 
(state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291479160> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913eaf30> <<< 8975 1727204031.39624: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.39642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 8975 1727204031.39686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914797f0> <<< 8975 1727204031.39693: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914794f0> <<< 8975 1727204031.39696: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 8975 1727204031.39738: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 8975 1727204031.39764: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729147a6f0> <<< 8975 1727204031.39774: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 8975 1727204031.39794: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 8975 1727204031.39834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 8975 1727204031.39872: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 8975 1727204031.39879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914948f0> <<< 8975 1727204031.39893: stdout chunk (state=3): >>>import 'errno' # <<< 8975 1727204031.39909: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204031.39912: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291496030> <<< 8975 1727204031.39941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 8975 1727204031.39946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 8975 1727204031.39968: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 8975 1727204031.39976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 8975 1727204031.40002: stdout chunk (state=3): >>>import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7291496ed0> <<< 8975 1727204031.40044: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291497500> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291496420> <<< 8975 1727204031.40072: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 8975 1727204031.40091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 8975 1727204031.40124: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291497ec0> <<< 8975 1727204031.40127: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914975f0> <<< 8975 1727204031.40180: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729147a660> <<< 8975 1727204031.40199: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 8975 1727204031.40222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 8975 1727204031.40256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 8975 1727204031.40259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 8975 1727204031.40290: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729126bd40> <<< 8975 1727204031.40327: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 8975 1727204031.40394: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291294830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291294590> <<< 8975 1727204031.40402: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291294860> <<< 8975 1727204031.40440: stdout chunk (state=3): >>># 
extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291294a40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291269ee0> <<< 8975 1727204031.40444: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 8975 1727204031.40541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 8975 1727204031.40569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 8975 1727204031.40589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912960f0> <<< 8975 1727204031.40622: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291294d70> <<< 8975 1727204031.40625: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729147ae10> <<< 8975 1727204031.40647: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 8975 1727204031.40708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.40725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 8975 1727204031.40759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 8975 1727204031.40791: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912be4b0> <<< 8975 1727204031.40858: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 8975 1727204031.40861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.40895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 8975 1727204031.40899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 8975 1727204031.40977: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912da5a0> <<< 8975 1727204031.40980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 8975 1727204031.41002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 8975 1727204031.41063: stdout chunk (state=3): >>>import 'ntpath' # <<< 8975 1727204031.41097: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 8975 1727204031.41103: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291313350> <<< 8975 1727204031.41114: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 8975 1727204031.41148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 8975 1727204031.41173: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 8975 1727204031.41214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 8975 1727204031.41306: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291339af0> <<< 8975 1727204031.41383: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291313470> <<< 8975 1727204031.41419: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912db200> <<< 8975 1727204031.41457: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 8975 1727204031.41463: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72911144a0> <<< 8975 1727204031.41475: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912d95e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291297050> <<< 8975 1727204031.41574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 8975 1727204031.41599: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7291114740> <<< 8975 1727204031.41677: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_45hh6s89/ansible_stat_payload.zip' <<< 8975 1727204031.41684: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.41826: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.41859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 8975 1727204031.41862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 8975 1727204031.41909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 8975 1727204031.41981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 8975 1727204031.42022: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 8975 1727204031.42027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729116e2a0> import '_typing' # <<< 8975 1727204031.42222: stdout chunk (state=3): >>>import 'typing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7291145190> <<< 8975 1727204031.42231: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72911442f0> # zipimport: zlib available <<< 8975 1727204031.42262: stdout chunk (state=3): >>>import 'ansible' # <<< 8975 1727204031.42280: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.42289: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.42292: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.42325: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 8975 1727204031.42329: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.43883: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.45157: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72911476b0> <<< 8975 1727204031.45162: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.45217: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 8975 1727204031.45234: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291199ca0> <<< 8975 1727204031.45289: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291199a30> <<< 8975 1727204031.45326: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291199340> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 8975 1727204031.45384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 8975 1727204031.45534: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291199d90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729116ecc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204031.45808: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729119a9c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729119abd0> <<< 8975 1727204031.45813: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729119b050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290ffcd70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7290ffe960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290fff320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910004d0> <<< 8975 1727204031.45833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 8975 1727204031.45859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 8975 1727204031.45889: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 8975 1727204031.45941: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291002fc0> <<< 8975 1727204031.46080: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204031.46105: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291003110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910012b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 8975 1727204031.46133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 8975 1727204031.46181: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291006ff0> import '_tokenize' # <<< 8975 1727204031.46234: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291005ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291005820> <<< 8975 1727204031.46262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 8975 1727204031.46344: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291007f80> <<< 8975 1727204031.46376: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910017c0> <<< 8975 1727204031.46402: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729104f170> <<< 8975 1727204031.46430: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729104f2c0> <<< 8975 1727204031.46454: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 8975 1727204031.46481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 8975 1727204031.46502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 8975 1727204031.46534: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291050ec0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291050c80> <<< 8975 1727204031.46630: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 8975 1727204031.46662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 8975 1727204031.46701: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291053350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291051520> <<< 8975 1727204031.46719: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 8975 1727204031.46760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.46792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 8975 1727204031.46841: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729105ab10> <<< 8975 1727204031.46977: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910534d0> <<< 8975 1727204031.47044: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105b980> <<< 8975 1727204031.47076: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105b7d0> <<< 8975 1727204031.47120: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105bf20> <<< 8975 1727204031.47143: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729104f590> <<< 8975 1727204031.47169: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 8975 1727204031.47178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 8975 1727204031.47198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 8975 1727204031.47232: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204031.47253: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105f620> <<< 8975 1727204031.47409: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204031.47420: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f72910609b0> <<< 8975 1727204031.47429: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729105ddf0> <<< 8975 1727204031.47469: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105f140> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729105da00> <<< 8975 1727204031.47482: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.47497: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 8975 1727204031.47507: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.47604: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.47696: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.47709: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.47718: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 8975 1727204031.47738: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.47754: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 8975 1727204031.47764: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.47895: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.48021: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.48613: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.49431: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 8975 1727204031.49435: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f72910e8aa0> <<< 8975 1727204031.49437: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910e9820> <<< 8975 1727204031.49449: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291060cb0> <<< 8975 1727204031.49482: stdout chunk (state=3): >>>import 
'ansible.module_utils.compat.selinux' # <<< 8975 1727204031.49502: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.49527: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 8975 1727204031.49542: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.49701: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.49868: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 8975 1727204031.49892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910e9970> # zipimport: zlib available <<< 8975 1727204031.50404: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.50896: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.50977: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51047: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 8975 1727204031.51064: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51096: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51139: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 8975 1727204031.51143: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51224: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51311: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 8975 1727204031.51331: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51334: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51354: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 8975 1727204031.51359: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51404: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51433: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 8975 1727204031.51455: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51704: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.51964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 8975 1727204031.52027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 8975 1727204031.52037: stdout chunk (state=3): >>>import '_ast' # <<< 8975 1727204031.52105: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910ea930> <<< 8975 1727204031.52114: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.52186: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.52270: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 8975 1727204031.52278: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 8975 1727204031.52288: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 8975 1727204031.52304: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 8975 
1727204031.52312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 8975 1727204031.52400: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 8975 1727204031.52515: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7290ef61e0> <<< 8975 1727204031.52583: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7290ef6b10> <<< 8975 1727204031.52603: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291063dd0> # zipimport: zlib available <<< 8975 1727204031.52649: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.52698: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 8975 1727204031.52702: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.52734: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.52791: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.52841: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.52913: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 8975 1727204031.52945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.53031: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7290ef5940> <<< 8975 1727204031.53087: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290ef6d50> <<< 8975 1727204031.53118: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 8975 1727204031.53189: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.53255: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.53290: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.53327: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 8975 1727204031.53338: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 8975 1727204031.53348: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 8975 1727204031.53376: 
stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 8975 1727204031.53389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 8975 1727204031.53443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 8975 1727204031.53465: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 8975 1727204031.53482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 8975 1727204031.53537: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290f86e10> <<< 8975 1727204031.53587: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290f00b90> <<< 8975 1727204031.53670: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290efed80> <<< 8975 1727204031.53674: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290efebd0> # destroy ansible.module_utils.distro <<< 8975 1727204031.53695: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 8975 1727204031.53699: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.53710: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.53739: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 8975 1727204031.53744: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 8975 1727204031.53803: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 8975 1727204031.53807: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.53825: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 8975 1727204031.53846: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.53983: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.54192: stdout chunk (state=3): >>># zipimport: zlib available <<< 8975 1727204031.54319: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8975 1727204031.54326: stdout chunk (state=3): >>># destroy __main__ <<< 8975 1727204031.54636: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 8975 1727204031.54639: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings <<< 8975 1727204031.54667: stdout chunk (state=3): >>># cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # 
cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 8975 1727204031.54690: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler <<< 8975 1727204031.54715: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse <<< 8975 1727204031.54721: stdout chunk (state=3): >>># cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner <<< 8975 1727204031.54743: stdout chunk (state=3): >>># cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # 
cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader <<< 8975 1727204031.54760: stdout chunk (state=3): >>># cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text <<< 8975 1727204031.54791: stdout chunk (state=3): >>># cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 8975 1727204031.54797: stdout chunk (state=3): >>># cleanup[2] removing 
selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 8975 1727204031.55026: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 8975 1727204031.55041: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 8975 1727204031.55057: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 8975 1727204031.55060: stdout chunk (state=3): >>># destroy binascii <<< 8975 1727204031.55073: stdout chunk (state=3): >>># destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 8975 1727204031.55088: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 8975 1727204031.55105: stdout chunk (state=3): >>># destroy ntpath <<< 8975 1727204031.55132: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib <<< 8975 1727204031.55139: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 8975 1727204031.55163: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select <<< 8975 1727204031.55177: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 8975 1727204031.55198: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array <<< 8975 1727204031.55219: stdout chunk (state=3): >>># destroy datetime <<< 8975 1727204031.55226: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 8975 1727204031.55244: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 8975 1727204031.55286: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<< 8975 1727204031.55304: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 8975 1727204031.55321: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # 
cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 8975 1727204031.55348: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 8975 1727204031.55352: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 8975 1727204031.55371: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 8975 1727204031.55386: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 8975 1727204031.55393: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 8975 1727204031.55408: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<< 8975 1727204031.55424: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 8975 1727204031.55430: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 8975 1727204031.55561: stdout chunk (state=3): >>># destroy sys.monitoring <<< 8975 1727204031.55567: stdout chunk (state=3): >>># destroy _socket <<< 8975 1727204031.55580: stdout chunk (state=3): >>># destroy _collections <<< 8975 1727204031.55601: stdout chunk (state=3): >>># destroy platform <<< 8975 1727204031.55614: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 8975 1727204031.55636: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib <<< 8975 1727204031.55641: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 8975 1727204031.55675: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response <<< 8975 1727204031.55699: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 8975 1727204031.55709: stdout chunk (state=3): >>># 
clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 8975 1727204031.55793: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 8975 1727204031.55807: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 8975 1727204031.55846: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 8975 1727204031.55874: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 8975 1727204031.55891: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 8975 1727204031.56379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204031.56382: stdout chunk (state=3): >>><<< 8975 1727204031.56384: stderr chunk (state=3): >>><<< 8975 1727204031.56434: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72915fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72915cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72915feab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913b11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913b20c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913effe0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291404170> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914279b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291427f80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291407c50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914053d0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913ed190> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729144b980> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729144a5a0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914062a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291448d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291478a10> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913ec410> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291478ec0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291478d70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291479160> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72913eaf30> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914797f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914794f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729147a6f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914948f0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291496030> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches 
/usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291496ed0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291497500> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291496420> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291497ec0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72914975f0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729147a660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729126bd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291294830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291294590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291294860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291294a40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291269ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches 
/usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912960f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291294d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729147ae10> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912be4b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912da5a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291313350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291339af0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291313470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912db200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72911144a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72912d95e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291297050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7291114740> # zipimport: found 30 names in '/tmp/ansible_stat_payload_45hh6s89/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729116e2a0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291145190> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72911442f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72911476b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291199ca0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291199a30> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291199340> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291199d90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729116ecc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729119a9c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729119abd0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches 
/usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729119b050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290ffcd70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7290ffe960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290fff320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910004d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291002fc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291003110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910012b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291006ff0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291005ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291005820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7291007f80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910017c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729104f170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729104f2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291050ec0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291050c80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7291053350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291051520> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729105ab10> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910534d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105b980> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105b7d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105bf20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729104f590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105f620> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f72910609b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729105ddf0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f729105f140> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f729105da00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f72910e8aa0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 
'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910e9820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291060cb0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910e9970> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f72910ea930> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7290ef61e0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7290ef6b10> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7291063dd0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f7290ef5940> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290ef6d50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290f86e10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290f00b90> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290efed80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7290efebd0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] 
removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 
# destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy 
tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
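The module stdout captured above starts with the stat module's JSON result ({"changed": false, "stat": {"exists": false}, ...}); everything after it is interpreter shutdown noise. A minimal sketch of a task that would produce this invocation, using only the module arguments visible in that result (the task name and the __ostree_booted_stat register name are taken from the trace that follows, so treat them as read back from the log rather than from the source file):

```yaml
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted      # module_args from the invocation above
    follow: false
    get_checksum: true
    get_mime: true
    get_attributes: true
    checksum_algorithm: sha1
  register: __ostree_booted_stat  # register name as referenced later in this trace
```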
[WARNING]: Module invocation had junk after the JSON data (the junk is the same interpreter cleanup trace shown above). 8975 1727204031.57154: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204031.57158: _low_level_execute_command(): starting 8975 1727204031.57160:
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204031.0842266-9561-80538599227897/ > /dev/null 2>&1 && sleep 0' 8975 1727204031.57540: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204031.57555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204031.57571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204031.57640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204031.57701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204031.57709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204031.57795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204031.59722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204031.59782: stderr chunk (state=3): >>><<< 8975 1727204031.59786: stdout chunk (state=3): >>><<< 8975 1727204031.59800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204031.59806: handler run complete 8975 1727204031.59826: attempt loop complete, returning result 8975 1727204031.59829: _execute() done 8975 1727204031.59831: dumping result to json 8975 1727204031.59834: done dumping result, returning 8975 1727204031.59842: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [127b8e07-fff9-9356-306d-0000000000e0] 8975 1727204031.59847: sending task result for task 
127b8e07-fff9-9356-306d-0000000000e0 8975 1727204031.59948: done sending task result for task 127b8e07-fff9-9356-306d-0000000000e0 8975 1727204031.59951: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 8975 1727204031.60017: no more pending results, returning what we have 8975 1727204031.60020: results queue empty 8975 1727204031.60023: checking for any_errors_fatal 8975 1727204031.60030: done checking for any_errors_fatal 8975 1727204031.60031: checking for max_fail_percentage 8975 1727204031.60032: done checking for max_fail_percentage 8975 1727204031.60033: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.60034: done checking to see if all hosts have failed 8975 1727204031.60035: getting the remaining hosts for this loop 8975 1727204031.60037: done getting the remaining hosts for this loop 8975 1727204031.60041: getting the next task for host managed-node2 8975 1727204031.60047: done getting next task for host managed-node2 8975 1727204031.60050: ^ task is: TASK: Set flag to indicate system is ostree 8975 1727204031.60053: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204031.60056: getting variables 8975 1727204031.60058: in VariableManager get_vars() 8975 1727204031.60091: Calling all_inventory to load vars for managed-node2 8975 1727204031.60094: Calling groups_inventory to load vars for managed-node2 8975 1727204031.60098: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.60109: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.60112: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.60115: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.60275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.60396: done with get_vars() 8975 1727204031.60405: done getting variables 8975 1727204031.60481: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.625) 0:00:02.921 ***** 8975 1727204031.60504: entering _queue_task() for managed-node2/set_fact 8975 1727204031.60505: Creating lock for set_fact 8975 1727204031.60748: worker is 1 (out of 1 available) 8975 1727204031.60762: exiting _queue_task() for managed-node2/set_fact 8975 1727204031.60776: done queuing things up, now waiting for results queue to drain 8975 1727204031.60778: waiting for pending results... 
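The set_fact task being queued here can be reconstructed, with some assumptions, from the evaluations that follow: the conditional not __network_is_ostree is defined, the __ostree_booted_stat variable, and the resulting fact __network_is_ostree: false. A plausible sketch (deriving the flag from stat.exists is an assumption that is merely consistent with the logged values):

```yaml
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # assumption: flag mirrors the stat result
  when: not __network_is_ostree is defined  # conditional evaluated as True in the trace below
```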
8975 1727204031.60919: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 8975 1727204031.60993: in run() - task 127b8e07-fff9-9356-306d-0000000000e1 8975 1727204031.61006: variable 'ansible_search_path' from source: unknown 8975 1727204031.61010: variable 'ansible_search_path' from source: unknown 8975 1727204031.61041: calling self._execute() 8975 1727204031.61103: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.61109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.61117: variable 'omit' from source: magic vars 8975 1727204031.61532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204031.61720: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204031.61756: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204031.61785: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204031.61811: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204031.61886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204031.61903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204031.61926: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204031.61944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204031.62050: Evaluated conditional (not __network_is_ostree is defined): True 8975 1727204031.62056: variable 'omit' from source: magic vars 8975 1727204031.62086: variable 'omit' from source: magic vars 8975 1727204031.62181: variable '__ostree_booted_stat' from source: set_fact 8975 1727204031.62223: variable 'omit' from source: magic vars 8975 1727204031.62245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204031.62270: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204031.62284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204031.62299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204031.62309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204031.62337: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204031.62340: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.62343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.62417: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204031.62425: Set 
connection var ansible_connection to ssh 8975 1727204031.62428: Set connection var ansible_shell_executable to /bin/sh 8975 1727204031.62430: Set connection var ansible_timeout to 10 8975 1727204031.62433: Set connection var ansible_shell_type to sh 8975 1727204031.62444: Set connection var ansible_pipelining to False 8975 1727204031.62462: variable 'ansible_shell_executable' from source: unknown 8975 1727204031.62467: variable 'ansible_connection' from source: unknown 8975 1727204031.62470: variable 'ansible_module_compression' from source: unknown 8975 1727204031.62472: variable 'ansible_shell_type' from source: unknown 8975 1727204031.62474: variable 'ansible_shell_executable' from source: unknown 8975 1727204031.62477: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.62481: variable 'ansible_pipelining' from source: unknown 8975 1727204031.62484: variable 'ansible_timeout' from source: unknown 8975 1727204031.62488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.62567: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204031.62576: variable 'omit' from source: magic vars 8975 1727204031.62581: starting attempt loop 8975 1727204031.62584: running the handler 8975 1727204031.62594: handler run complete 8975 1727204031.62601: attempt loop complete, returning result 8975 1727204031.62604: _execute() done 8975 1727204031.62606: dumping result to json 8975 1727204031.62609: done dumping result, returning 8975 1727204031.62616: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [127b8e07-fff9-9356-306d-0000000000e1] 8975 1727204031.62623: sending task result for task 127b8e07-fff9-9356-306d-0000000000e1 8975 1727204031.62710: done sending task result for task 127b8e07-fff9-9356-306d-0000000000e1 8975 1727204031.62713: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 8975 1727204031.62773: no more pending results, returning what we have 8975 1727204031.62775: results queue empty 8975 1727204031.62776: checking for any_errors_fatal 8975 1727204031.62783: done checking for any_errors_fatal 8975 1727204031.62784: checking for max_fail_percentage 8975 1727204031.62786: done checking for max_fail_percentage 8975 1727204031.62786: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.62787: done checking to see if all hosts have failed 8975 1727204031.62788: getting the remaining hosts for this loop 8975 1727204031.62790: done getting the remaining hosts for this loop 8975 1727204031.62794: getting the next task for host managed-node2 8975 1727204031.62804: done getting next task for host managed-node2 8975 1727204031.62806: ^ task is: TASK: Fix CentOS6 Base repo 8975 1727204031.62808: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204031.62812: getting variables 8975 1727204031.62814: in VariableManager get_vars() 8975 1727204031.62845: Calling all_inventory to load vars for managed-node2 8975 1727204031.62848: Calling groups_inventory to load vars for managed-node2 8975 1727204031.62851: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.62860: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.62863: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.62879: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.63051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.63176: done with get_vars() 8975 1727204031.63184: done getting variables 8975 1727204031.63283: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.027) 0:00:02.950 ***** 8975 1727204031.63307: entering _queue_task() for managed-node2/copy 8975 1727204031.63558: worker is 1 (out of 1 available) 8975 1727204031.63571: exiting _queue_task() for managed-node2/copy 8975 1727204031.63584: done queuing things up, now waiting for results queue to drain 8975 1727204031.63586: waiting for pending results... 
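The task queued next is skipped below because ansible_distribution is not 'CentOS'. Only the task name, the copy action plugin, and the conditional are visible in the trace, so the copy payload in this sketch is a hypothetical placeholder:

```yaml
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical; the real destination is not shown in this trace
    content: |
      # repository definition not recoverable from this log
  when: ansible_distribution == 'CentOS'      # evaluated False here, so the task is skipped
```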
8975 1727204031.63743: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 8975 1727204031.63816: in run() - task 127b8e07-fff9-9356-306d-0000000000e3 8975 1727204031.63830: variable 'ansible_search_path' from source: unknown 8975 1727204031.63833: variable 'ansible_search_path' from source: unknown 8975 1727204031.63863: calling self._execute() 8975 1727204031.63928: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.63934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.63944: variable 'omit' from source: magic vars 8975 1727204031.64307: variable 'ansible_distribution' from source: facts 8975 1727204031.64327: Evaluated conditional (ansible_distribution == 'CentOS'): False 8975 1727204031.64330: when evaluation is False, skipping this task 8975 1727204031.64333: _execute() done 8975 1727204031.64336: dumping result to json 8975 1727204031.64338: done dumping result, returning 8975 1727204031.64344: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [127b8e07-fff9-9356-306d-0000000000e3] 8975 1727204031.64349: sending task result for task 127b8e07-fff9-9356-306d-0000000000e3 8975 1727204031.64452: done sending task result for task 127b8e07-fff9-9356-306d-0000000000e3 8975 1727204031.64455: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 8975 1727204031.64531: no more pending results, returning what we have 8975 1727204031.64534: results queue empty 8975 1727204031.64535: checking for any_errors_fatal 8975 1727204031.64539: done checking for any_errors_fatal 8975 1727204031.64540: checking for max_fail_percentage 8975 1727204031.64541: done checking for max_fail_percentage 8975 1727204031.64542: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.64543: done checking to see if all hosts have failed 8975 1727204031.64544: getting the remaining hosts for this loop 8975 1727204031.64545: done getting the remaining hosts for this loop 8975 1727204031.64550: getting the next task for host managed-node2 8975 1727204031.64556: done getting next task for host managed-node2 8975 1727204031.64559: ^ task is: TASK: Include the task 'enable_epel.yml' 8975 1727204031.64562: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204031.64567: getting variables 8975 1727204031.64568: in VariableManager get_vars() 8975 1727204031.64595: Calling all_inventory to load vars for managed-node2 8975 1727204031.64597: Calling groups_inventory to load vars for managed-node2 8975 1727204031.64600: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.64610: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.64612: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.64615: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.64743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.64888: done with get_vars() 8975 1727204031.64895: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.016) 0:00:02.966 ***** 8975 1727204031.64981: entering _queue_task() for managed-node2/include_tasks 8975 1727204031.65214: worker is 1 (out of 1 available) 8975 1727204031.65230: exiting _queue_task() for managed-node2/include_tasks 8975 1727204031.65244: done queuing things up, now waiting for results queue to drain 8975 1727204031.65245: waiting for pending results... 8975 1727204031.65394: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 8975 1727204031.65469: in run() - task 127b8e07-fff9-9356-306d-0000000000e4 8975 1727204031.65480: variable 'ansible_search_path' from source: unknown 8975 1727204031.65484: variable 'ansible_search_path' from source: unknown 8975 1727204031.65515: calling self._execute() 8975 1727204031.65574: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.65581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.65589: variable 'omit' from source: magic vars 8975 1727204031.65966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204031.67597: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204031.67648: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204031.67680: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204031.67708: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204031.67729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204031.67798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204031.67824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204031.67842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204031.67872: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204031.67883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204031.67977: variable '__network_is_ostree' from source: set_fact 8975 1727204031.67991: Evaluated conditional (not __network_is_ostree | d(false)): True 8975 1727204031.67997: _execute() done 8975 1727204031.68001: dumping result to json 8975 1727204031.68004: done dumping result, returning 8975 1727204031.68012: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-9356-306d-0000000000e4] 8975 1727204031.68017: sending task result for task 127b8e07-fff9-9356-306d-0000000000e4 8975 1727204031.68117: done sending task result for task 127b8e07-fff9-9356-306d-0000000000e4 8975 1727204031.68119: WORKER PROCESS EXITING 8975 1727204031.68155: no more pending results, returning what we have 8975 1727204031.68163: in VariableManager get_vars() 8975 1727204031.68201: Calling all_inventory to load vars for managed-node2 8975 1727204031.68204: Calling groups_inventory to load vars for managed-node2 8975 1727204031.68208: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.68220: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.68225: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.68228: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.68386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.68505: done with get_vars() 8975 1727204031.68512: variable 'ansible_search_path' from source: unknown 8975 1727204031.68512: variable 'ansible_search_path' from source: unknown 8975 1727204031.68543: we have included files to process 8975 1727204031.68544: generating all_blocks data 8975 1727204031.68545: done generating all_blocks data 8975 1727204031.68550: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 8975 1727204031.68552: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 8975 1727204031.68553: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 8975 1727204031.69063: done processing included file 8975 1727204031.69067: iterating over new_blocks loaded from include file 8975 1727204031.69069: in VariableManager get_vars() 8975 1727204031.69078: done with get_vars() 8975 1727204031.69079: filtering new block on tags 8975 1727204031.69095: done filtering new block on tags 8975 1727204031.69097: in VariableManager get_vars() 8975 1727204031.69103: done with get_vars() 8975 1727204031.69104: filtering new block on tags 8975 1727204031.69111: done filtering new block on tags 8975 1727204031.69113: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 8975 1727204031.69118: extending task lists for all hosts with included blocks 8975 1727204031.69189: done extending task lists 8975 1727204031.69190: done 
processing included files 8975 1727204031.69190: results queue empty 8975 1727204031.69191: checking for any_errors_fatal 8975 1727204031.69194: done checking for any_errors_fatal 8975 1727204031.69194: checking for max_fail_percentage 8975 1727204031.69195: done checking for max_fail_percentage 8975 1727204031.69195: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.69196: done checking to see if all hosts have failed 8975 1727204031.69196: getting the remaining hosts for this loop 8975 1727204031.69197: done getting the remaining hosts for this loop 8975 1727204031.69199: getting the next task for host managed-node2 8975 1727204031.69202: done getting next task for host managed-node2 8975 1727204031.69204: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 8975 1727204031.69206: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204031.69208: getting variables 8975 1727204031.69209: in VariableManager get_vars() 8975 1727204031.69215: Calling all_inventory to load vars for managed-node2 8975 1727204031.69216: Calling groups_inventory to load vars for managed-node2 8975 1727204031.69218: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.69225: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.69232: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.69234: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.69338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.69454: done with get_vars() 8975 1727204031.69460: done getting variables 8975 1727204031.69519: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 8975 1727204031.69681: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.047) 0:00:03.014 ***** 8975 1727204031.69718: entering _queue_task() for managed-node2/command 8975 1727204031.69719: Creating lock for command 8975 1727204031.69985: worker is 1 (out of 1 available) 8975 1727204031.69998: exiting _queue_task() for managed-node2/command 8975 1727204031.70013: done queuing things up, now waiting for results queue to drain 8975 
1727204031.70015: waiting for pending results... 8975 1727204031.70175: running TaskExecutor() for managed-node2/TASK: Create EPEL 40 8975 1727204031.70259: in run() - task 127b8e07-fff9-9356-306d-0000000000fe 8975 1727204031.70279: variable 'ansible_search_path' from source: unknown 8975 1727204031.70283: variable 'ansible_search_path' from source: unknown 8975 1727204031.70314: calling self._execute() 8975 1727204031.70382: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.70386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.70394: variable 'omit' from source: magic vars 8975 1727204031.70701: variable 'ansible_distribution' from source: facts 8975 1727204031.70710: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8975 1727204031.70713: when evaluation is False, skipping this task 8975 1727204031.70716: _execute() done 8975 1727204031.70719: dumping result to json 8975 1727204031.70721: done dumping result, returning 8975 1727204031.70731: done running TaskExecutor() for managed-node2/TASK: Create EPEL 40 [127b8e07-fff9-9356-306d-0000000000fe] 8975 1727204031.70736: sending task result for task 127b8e07-fff9-9356-306d-0000000000fe 8975 1727204031.70844: done sending task result for task 127b8e07-fff9-9356-306d-0000000000fe 8975 1727204031.70847: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8975 1727204031.70903: no more pending results, returning what we have 8975 1727204031.70907: results queue empty 8975 1727204031.70908: checking for any_errors_fatal 8975 1727204031.70909: done checking for any_errors_fatal 8975 1727204031.70910: checking for max_fail_percentage 8975 1727204031.70911: done checking for max_fail_percentage 8975 1727204031.70912: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.70914: done checking to see if all hosts have failed 8975 1727204031.70914: getting the remaining hosts for this loop 8975 1727204031.70916: done getting the remaining hosts for this loop 8975 1727204031.70920: getting the next task for host managed-node2 8975 1727204031.70928: done getting next task for host managed-node2 8975 1727204031.70931: ^ task is: TASK: Install yum-utils package 8975 1727204031.70934: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204031.70938: getting variables 8975 1727204031.70940: in VariableManager get_vars() 8975 1727204031.70975: Calling all_inventory to load vars for managed-node2 8975 1727204031.70978: Calling groups_inventory to load vars for managed-node2 8975 1727204031.70981: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.70992: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.70994: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.70997: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.71136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.71297: done with get_vars() 8975 1727204031.71304: done getting variables 8975 1727204031.71386: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.016) 0:00:03.031 ***** 8975 1727204031.71410: entering _queue_task() for managed-node2/package 8975 1727204031.71411: Creating lock for package 8975 1727204031.71671: worker is 1 (out of 1 available) 8975 1727204031.71684: exiting _queue_task() for managed-node2/package 8975 1727204031.71699: done queuing things up, now waiting for results queue to drain 8975 1727204031.71700: waiting for pending results... 
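
The two skipped tasks above come from enable_epel.yml, and the log shows why they are skipped: the only conditional evaluated, ansible_distribution in ['RedHat', 'CentOS'], is False because the managed nodes report Fedora 40 in the gathered facts further down. A minimal sketch of what such guarded tasks look like follows; only the task names, the action plugins (command and package), and the when: expression are taken from the log, while the module arguments are illustrative placeholders, not the actual contents of enable_epel.yml.

- name: Create EPEL {{ ansible_distribution_major_version }}
  command: echo "install the epel-release package for EL{{ ansible_distribution_major_version }} here"  # placeholder body; the real command is not shown in the log
  when: ansible_distribution in ['RedHat', 'CentOS']

- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']

On Fedora the conditional is False, so each task is reported as skipping: with "skip_reason": "Conditional result was False" and "false_condition" set to the failing expression, exactly as in the result blocks above.
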
8975 1727204031.71878: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 8975 1727204031.71959: in run() - task 127b8e07-fff9-9356-306d-0000000000ff 8975 1727204031.71971: variable 'ansible_search_path' from source: unknown 8975 1727204031.71975: variable 'ansible_search_path' from source: unknown 8975 1727204031.72007: calling self._execute() 8975 1727204031.72076: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.72083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.72092: variable 'omit' from source: magic vars 8975 1727204031.72416: variable 'ansible_distribution' from source: facts 8975 1727204031.72431: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8975 1727204031.72435: when evaluation is False, skipping this task 8975 1727204031.72437: _execute() done 8975 1727204031.72440: dumping result to json 8975 1727204031.72443: done dumping result, returning 8975 1727204031.72446: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [127b8e07-fff9-9356-306d-0000000000ff] 8975 1727204031.72451: sending task result for task 127b8e07-fff9-9356-306d-0000000000ff 8975 1727204031.72560: done sending task result for task 127b8e07-fff9-9356-306d-0000000000ff 8975 1727204031.72562: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8975 1727204031.72631: no more pending results, returning what we have 8975 1727204031.72634: results queue empty 8975 1727204031.72635: checking for any_errors_fatal 8975 1727204031.72641: done checking for any_errors_fatal 8975 1727204031.72642: checking for max_fail_percentage 8975 1727204031.72643: done checking for max_fail_percentage 8975 1727204031.72644: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.72645: done checking to see if all hosts have failed 8975 1727204031.72646: getting the remaining hosts for this loop 8975 1727204031.72648: done getting the remaining hosts for this loop 8975 1727204031.72652: getting the next task for host managed-node2 8975 1727204031.72659: done getting next task for host managed-node2 8975 1727204031.72661: ^ task is: TASK: Enable EPEL 7 8975 1727204031.72667: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204031.72670: getting variables 8975 1727204031.72671: in VariableManager get_vars() 8975 1727204031.72699: Calling all_inventory to load vars for managed-node2 8975 1727204031.72702: Calling groups_inventory to load vars for managed-node2 8975 1727204031.72705: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.72715: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.72718: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.72720: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.72870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.72994: done with get_vars() 8975 1727204031.73009: done getting variables 8975 1727204031.73064: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.016) 0:00:03.047 ***** 8975 1727204031.73089: entering _queue_task() for managed-node2/command 8975 1727204031.73570: worker is 1 (out of 1 available) 8975 1727204031.73583: exiting _queue_task() for managed-node2/command 8975 1727204031.73597: done queuing things up, now waiting for results queue to drain 8975 1727204031.73598: waiting for pending results... 8975 1727204031.73861: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 8975 1727204031.73910: in run() - task 127b8e07-fff9-9356-306d-000000000100 8975 1727204031.73984: variable 'ansible_search_path' from source: unknown 8975 1727204031.73988: variable 'ansible_search_path' from source: unknown 8975 1727204031.73991: calling self._execute() 8975 1727204031.74199: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.74204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.74208: variable 'omit' from source: magic vars 8975 1727204031.74614: variable 'ansible_distribution' from source: facts 8975 1727204031.74637: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8975 1727204031.74645: when evaluation is False, skipping this task 8975 1727204031.74971: _execute() done 8975 1727204031.74976: dumping result to json 8975 1727204031.74979: done dumping result, returning 8975 1727204031.74982: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [127b8e07-fff9-9356-306d-000000000100] 8975 1727204031.74985: sending task result for task 127b8e07-fff9-9356-306d-000000000100 8975 1727204031.75113: done sending task result for task 127b8e07-fff9-9356-306d-000000000100 8975 1727204031.75116: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8975 1727204031.75191: no more pending results, returning what we have 8975 1727204031.75195: results queue empty 8975 1727204031.75196: checking for any_errors_fatal 8975 1727204031.75204: done checking for any_errors_fatal 8975 1727204031.75205: checking for max_fail_percentage 8975 
1727204031.75207: done checking for max_fail_percentage 8975 1727204031.75208: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.75210: done checking to see if all hosts have failed 8975 1727204031.75211: getting the remaining hosts for this loop 8975 1727204031.75213: done getting the remaining hosts for this loop 8975 1727204031.75217: getting the next task for host managed-node2 8975 1727204031.75229: done getting next task for host managed-node2 8975 1727204031.75231: ^ task is: TASK: Enable EPEL 8 8975 1727204031.75235: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204031.75239: getting variables 8975 1727204031.75242: in VariableManager get_vars() 8975 1727204031.75294: Calling all_inventory to load vars for managed-node2 8975 1727204031.75301: Calling groups_inventory to load vars for managed-node2 8975 1727204031.75313: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.75338: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.75342: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.75347: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.75902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.76140: done with get_vars() 8975 1727204031.76154: done getting variables 8975 1727204031.76227: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.031) 0:00:03.079 ***** 8975 1727204031.76279: entering _queue_task() for managed-node2/command 8975 1727204031.76662: worker is 1 (out of 1 available) 8975 1727204031.76860: exiting _queue_task() for managed-node2/command 8975 1727204031.76882: done queuing things up, now waiting for results queue to drain 8975 1727204031.76883: waiting for pending results... 
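
The Enable EPEL 7 and Enable EPEL 8 tasks are queued and skipped the same way. Judging from the task names, they are presumably gated on the distribution major version in addition to the distribution itself; the sketch below is an assumption about their shape (the log only shows the distribution check being evaluated and failing), and the command body is a placeholder.

- name: Enable EPEL 7
  command: echo "enable the EL7 epel repository here"  # placeholder; the actual command is not in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '7'

Items in a when: list are evaluated in order and evaluation stops at the first failing entry, which is why only the distribution check appears as false_condition in the skip results here.
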
8975 1727204031.77136: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 8975 1727204031.77331: in run() - task 127b8e07-fff9-9356-306d-000000000101 8975 1727204031.77339: variable 'ansible_search_path' from source: unknown 8975 1727204031.77344: variable 'ansible_search_path' from source: unknown 8975 1727204031.77347: calling self._execute() 8975 1727204031.77480: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.77484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.77507: variable 'omit' from source: magic vars 8975 1727204031.77941: variable 'ansible_distribution' from source: facts 8975 1727204031.77971: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8975 1727204031.77979: when evaluation is False, skipping this task 8975 1727204031.77984: _execute() done 8975 1727204031.77987: dumping result to json 8975 1727204031.77990: done dumping result, returning 8975 1727204031.77993: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [127b8e07-fff9-9356-306d-000000000101] 8975 1727204031.78017: sending task result for task 127b8e07-fff9-9356-306d-000000000101 8975 1727204031.78124: done sending task result for task 127b8e07-fff9-9356-306d-000000000101 8975 1727204031.78127: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8975 1727204031.78203: no more pending results, returning what we have 8975 1727204031.78206: results queue empty 8975 1727204031.78207: checking for any_errors_fatal 8975 1727204031.78211: done checking for any_errors_fatal 8975 1727204031.78212: checking for max_fail_percentage 8975 1727204031.78214: done checking for max_fail_percentage 8975 1727204031.78215: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.78216: done checking to see if all hosts have failed 8975 1727204031.78217: getting the remaining hosts for this loop 8975 1727204031.78218: done getting the remaining hosts for this loop 8975 1727204031.78223: getting the next task for host managed-node2 8975 1727204031.78233: done getting next task for host managed-node2 8975 1727204031.78235: ^ task is: TASK: Enable EPEL 6 8975 1727204031.78239: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204031.78245: getting variables 8975 1727204031.78246: in VariableManager get_vars() 8975 1727204031.78277: Calling all_inventory to load vars for managed-node2 8975 1727204031.78279: Calling groups_inventory to load vars for managed-node2 8975 1727204031.78282: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.78293: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.78295: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.78298: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.78471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.78633: done with get_vars() 8975 1727204031.78648: done getting variables 8975 1727204031.78711: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.024) 0:00:03.104 ***** 8975 1727204031.78742: entering _queue_task() for managed-node2/copy 8975 1727204031.79057: worker is 1 (out of 1 available) 8975 1727204031.79072: exiting _queue_task() for managed-node2/copy 8975 1727204031.79087: done queuing things up, now waiting for results queue to drain 8975 1727204031.79089: waiting for pending results... 8975 1727204031.79387: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 8975 1727204031.79572: in run() - task 127b8e07-fff9-9356-306d-000000000103 8975 1727204031.79576: variable 'ansible_search_path' from source: unknown 8975 1727204031.79579: variable 'ansible_search_path' from source: unknown 8975 1727204031.79582: calling self._execute() 8975 1727204031.79641: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.79654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.79671: variable 'omit' from source: magic vars 8975 1727204031.80109: variable 'ansible_distribution' from source: facts 8975 1727204031.80125: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 8975 1727204031.80130: when evaluation is False, skipping this task 8975 1727204031.80146: _execute() done 8975 1727204031.80153: dumping result to json 8975 1727204031.80157: done dumping result, returning 8975 1727204031.80169: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [127b8e07-fff9-9356-306d-000000000103] 8975 1727204031.80172: sending task result for task 127b8e07-fff9-9356-306d-000000000103 8975 1727204031.80279: done sending task result for task 127b8e07-fff9-9356-306d-000000000103 8975 1727204031.80284: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 8975 1727204031.80342: no more pending results, returning what we have 8975 1727204031.80345: results queue empty 8975 1727204031.80346: checking for any_errors_fatal 8975 1727204031.80351: done checking for any_errors_fatal 8975 1727204031.80352: checking for max_fail_percentage 8975 1727204031.80354: done 
checking for max_fail_percentage 8975 1727204031.80354: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.80356: done checking to see if all hosts have failed 8975 1727204031.80356: getting the remaining hosts for this loop 8975 1727204031.80358: done getting the remaining hosts for this loop 8975 1727204031.80367: getting the next task for host managed-node2 8975 1727204031.80377: done getting next task for host managed-node2 8975 1727204031.80380: ^ task is: TASK: Set network provider to 'nm' 8975 1727204031.80382: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204031.80386: getting variables 8975 1727204031.80387: in VariableManager get_vars() 8975 1727204031.80418: Calling all_inventory to load vars for managed-node2 8975 1727204031.80420: Calling groups_inventory to load vars for managed-node2 8975 1727204031.80423: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.80434: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.80436: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.80439: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.80613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.80733: done with get_vars() 8975 1727204031.80741: done getting variables 8975 1727204031.80812: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:13 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.020) 0:00:03.125 ***** 8975 1727204031.80837: entering _queue_task() for managed-node2/set_fact 8975 1727204031.81102: worker is 1 (out of 1 available) 8975 1727204031.81117: exiting _queue_task() for managed-node2/set_fact 8975 1727204031.81134: done queuing things up, now waiting for results queue to drain 8975 1727204031.81135: waiting for pending results... 
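
The Set network provider to 'nm' task at tests_bond_deprecated_nm.yml:13 runs the set_fact action and, as its result a few entries below shows, leaves network_provider: nm in ansible_facts. Reconstructed from the task name, the action plugin, and the returned fact, the task is presumably just the following sketch, so that later tasks can branch on network_provider directly:

- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
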
8975 1727204031.81294: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 8975 1727204031.81357: in run() - task 127b8e07-fff9-9356-306d-000000000007 8975 1727204031.81371: variable 'ansible_search_path' from source: unknown 8975 1727204031.81413: calling self._execute() 8975 1727204031.81485: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.81491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.81500: variable 'omit' from source: magic vars 8975 1727204031.81588: variable 'omit' from source: magic vars 8975 1727204031.81615: variable 'omit' from source: magic vars 8975 1727204031.81644: variable 'omit' from source: magic vars 8975 1727204031.81693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204031.81727: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204031.81744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204031.81760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204031.81772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204031.81800: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204031.81803: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.81806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.82018: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204031.82021: Set connection var ansible_connection to ssh 8975 1727204031.82023: Set connection var ansible_shell_executable to /bin/sh 8975 1727204031.82026: Set connection var ansible_timeout to 10 8975 1727204031.82028: Set connection var ansible_shell_type to sh 8975 1727204031.82030: Set connection var ansible_pipelining to False 8975 1727204031.82032: variable 'ansible_shell_executable' from source: unknown 8975 1727204031.82034: variable 'ansible_connection' from source: unknown 8975 1727204031.82036: variable 'ansible_module_compression' from source: unknown 8975 1727204031.82039: variable 'ansible_shell_type' from source: unknown 8975 1727204031.82041: variable 'ansible_shell_executable' from source: unknown 8975 1727204031.82043: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.82045: variable 'ansible_pipelining' from source: unknown 8975 1727204031.82047: variable 'ansible_timeout' from source: unknown 8975 1727204031.82050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.82284: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204031.82302: variable 'omit' from source: magic vars 8975 1727204031.82311: starting attempt loop 8975 1727204031.82318: running the handler 8975 1727204031.82336: handler run complete 8975 1727204031.82356: attempt loop complete, returning result 8975 1727204031.82364: _execute() done 8975 1727204031.82373: dumping result to json 8975 
1727204031.82381: done dumping result, returning 8975 1727204031.82391: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [127b8e07-fff9-9356-306d-000000000007] 8975 1727204031.82399: sending task result for task 127b8e07-fff9-9356-306d-000000000007 ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 8975 1727204031.82664: no more pending results, returning what we have 8975 1727204031.82825: results queue empty 8975 1727204031.82827: checking for any_errors_fatal 8975 1727204031.82833: done checking for any_errors_fatal 8975 1727204031.82834: checking for max_fail_percentage 8975 1727204031.82836: done checking for max_fail_percentage 8975 1727204031.82837: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.82838: done checking to see if all hosts have failed 8975 1727204031.82838: getting the remaining hosts for this loop 8975 1727204031.82840: done getting the remaining hosts for this loop 8975 1727204031.82848: getting the next task for host managed-node2 8975 1727204031.82855: done getting next task for host managed-node2 8975 1727204031.82857: ^ task is: TASK: meta (flush_handlers) 8975 1727204031.82859: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204031.82863: getting variables 8975 1727204031.82864: in VariableManager get_vars() 8975 1727204031.82933: Calling all_inventory to load vars for managed-node2 8975 1727204031.82937: Calling groups_inventory to load vars for managed-node2 8975 1727204031.82940: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.82950: done sending task result for task 127b8e07-fff9-9356-306d-000000000007 8975 1727204031.82953: WORKER PROCESS EXITING 8975 1727204031.83080: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.83083: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.83087: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.83498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.83974: done with get_vars() 8975 1727204031.83990: done getting variables 8975 1727204031.84123: in VariableManager get_vars() 8975 1727204031.84252: Calling all_inventory to load vars for managed-node2 8975 1727204031.84256: Calling groups_inventory to load vars for managed-node2 8975 1727204031.84259: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.84267: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.84270: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.84273: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.85178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.85669: done with get_vars() 8975 1727204031.85695: done queuing things up, now waiting for results queue to drain 8975 1727204031.85697: results queue empty 8975 1727204031.85698: checking for any_errors_fatal 8975 1727204031.85701: done checking for any_errors_fatal 8975 1727204031.85702: checking for max_fail_percentage 8975 1727204031.85703: done checking for max_fail_percentage 8975 
1727204031.85704: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.85705: done checking to see if all hosts have failed 8975 1727204031.85706: getting the remaining hosts for this loop 8975 1727204031.85707: done getting the remaining hosts for this loop 8975 1727204031.85710: getting the next task for host managed-node2 8975 1727204031.85715: done getting next task for host managed-node2 8975 1727204031.85717: ^ task is: TASK: meta (flush_handlers) 8975 1727204031.85718: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204031.85731: getting variables 8975 1727204031.85732: in VariableManager get_vars() 8975 1727204031.85743: Calling all_inventory to load vars for managed-node2 8975 1727204031.85746: Calling groups_inventory to load vars for managed-node2 8975 1727204031.85749: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.85756: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.85758: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.85761: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.86129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.86648: done with get_vars() 8975 1727204031.86661: done getting variables 8975 1727204031.86725: in VariableManager get_vars() 8975 1727204031.86738: Calling all_inventory to load vars for managed-node2 8975 1727204031.86741: Calling groups_inventory to load vars for managed-node2 8975 1727204031.86743: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.86864: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.86870: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.86875: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.87144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.87673: done with get_vars() 8975 1727204031.87690: done queuing things up, now waiting for results queue to drain 8975 1727204031.87692: results queue empty 8975 1727204031.87693: checking for any_errors_fatal 8975 1727204031.87695: done checking for any_errors_fatal 8975 1727204031.87696: checking for max_fail_percentage 8975 1727204031.87697: done checking for max_fail_percentage 8975 1727204031.87698: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.87698: done checking to see if all hosts have failed 8975 1727204031.87699: getting the remaining hosts for this loop 8975 1727204031.87700: done getting the remaining hosts for this loop 8975 1727204031.87703: getting the next task for host managed-node2 8975 1727204031.87707: done getting next task for host managed-node2 8975 1727204031.87708: ^ task is: None 8975 1727204031.87709: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204031.87710: done queuing things up, now waiting for results queue to drain 8975 1727204031.87711: results queue empty 8975 1727204031.87712: checking for any_errors_fatal 8975 1727204031.87713: done checking for any_errors_fatal 8975 1727204031.87714: checking for max_fail_percentage 8975 1727204031.87715: done checking for max_fail_percentage 8975 1727204031.87715: checking to see if all hosts have failed and the running result is not ok 8975 1727204031.87716: done checking to see if all hosts have failed 8975 1727204031.87718: getting the next task for host managed-node2 8975 1727204031.87723: done getting next task for host managed-node2 8975 1727204031.87724: ^ task is: None 8975 1727204031.87726: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204031.87895: in VariableManager get_vars() 8975 1727204031.87927: done with get_vars() 8975 1727204031.87935: in VariableManager get_vars() 8975 1727204031.88069: done with get_vars() 8975 1727204031.88077: variable 'omit' from source: magic vars 8975 1727204031.88113: in VariableManager get_vars() 8975 1727204031.88133: done with get_vars() 8975 1727204031.88159: variable 'omit' from source: magic vars PLAY [Play for testing bond device using deprecated 'master' argument] ********* 8975 1727204031.90015: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 8975 1727204031.90143: getting the remaining hosts for this loop 8975 1727204031.90145: done getting the remaining hosts for this loop 8975 1727204031.90148: getting the next task for host managed-node2 8975 1727204031.90151: done getting next task for host managed-node2 8975 1727204031.90153: ^ task is: TASK: Gathering Facts 8975 1727204031.90155: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204031.90157: getting variables 8975 1727204031.90158: in VariableManager get_vars() 8975 1727204031.90178: Calling all_inventory to load vars for managed-node2 8975 1727204031.90181: Calling groups_inventory to load vars for managed-node2 8975 1727204031.90183: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204031.90190: Calling all_plugins_play to load vars for managed-node2 8975 1727204031.90206: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204031.90210: Calling groups_plugins_play to load vars for managed-node2 8975 1727204031.90693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204031.91156: done with get_vars() 8975 1727204031.91170: done getting variables 8975 1727204031.91268: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.104) 0:00:03.229 ***** 8975 1727204031.91296: entering _queue_task() for managed-node2/gather_facts 8975 1727204031.92010: worker is 1 (out of 1 available) 8975 1727204031.92027: exiting _queue_task() for managed-node2/gather_facts 8975 1727204031.92040: done queuing things up, now waiting for results queue to drain 8975 1727204031.92042: waiting for pending results... 8975 1727204031.92585: running TaskExecutor() for managed-node2/TASK: Gathering Facts 8975 1727204031.92590: in run() - task 127b8e07-fff9-9356-306d-000000000129 8975 1727204031.92595: variable 'ansible_search_path' from source: unknown 8975 1727204031.93173: calling self._execute() 8975 1727204031.93178: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.93181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.93184: variable 'omit' from source: magic vars 8975 1727204031.93734: variable 'ansible_distribution_major_version' from source: facts 8975 1727204031.94173: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204031.94179: variable 'omit' from source: magic vars 8975 1727204031.94182: variable 'omit' from source: magic vars 8975 1727204031.94185: variable 'omit' from source: magic vars 8975 1727204031.94188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204031.94194: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204031.94383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204031.94402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204031.94423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204031.94463: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204031.94476: variable 'ansible_host' from source: host vars for 'managed-node2' 
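
The Set connection var entries around here show how the connection to managed-node2 is assembled: ansible_connection is ssh, the shell is /bin/sh, the timeout is 10 seconds, pipelining is off, and module payloads are ZIP_DEFLATED compressed, while ansible_host and ansible_ssh_extra_args come from host vars in the inventory. A sketch of the corresponding inventory entry, assuming a YAML inventory like the one loaded at the start of the run; the address is the one visible in the SSH debug output below, and the extra-args value is a placeholder because the real value is never printed.

all:
  hosts:
    managed-node2:
      ansible_host: 10.31.47.73
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder; actual value not shown in the log

The remaining connection settings listed here match Ansible's stock defaults rather than anything set per host, which is why the log attributes them to source: unknown instead of host vars.
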
8975 1727204031.94485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.94972: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204031.94976: Set connection var ansible_connection to ssh 8975 1727204031.94978: Set connection var ansible_shell_executable to /bin/sh 8975 1727204031.94980: Set connection var ansible_timeout to 10 8975 1727204031.94982: Set connection var ansible_shell_type to sh 8975 1727204031.94984: Set connection var ansible_pipelining to False 8975 1727204031.94986: variable 'ansible_shell_executable' from source: unknown 8975 1727204031.94988: variable 'ansible_connection' from source: unknown 8975 1727204031.94990: variable 'ansible_module_compression' from source: unknown 8975 1727204031.94992: variable 'ansible_shell_type' from source: unknown 8975 1727204031.94994: variable 'ansible_shell_executable' from source: unknown 8975 1727204031.94996: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204031.94998: variable 'ansible_pipelining' from source: unknown 8975 1727204031.95000: variable 'ansible_timeout' from source: unknown 8975 1727204031.95002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204031.95301: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204031.95322: variable 'omit' from source: magic vars 8975 1727204031.95332: starting attempt loop 8975 1727204031.95339: running the handler 8975 1727204031.95359: variable 'ansible_facts' from source: unknown 8975 1727204031.95734: _low_level_execute_command(): starting 8975 1727204031.95737: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204031.97191: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204031.97225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204031.97238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204031.97288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204031.97406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204031.99191: stdout chunk (state=3): >>>/root <<< 8975 1727204031.99287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204031.99505: stderr 
chunk (state=3): >>><<< 8975 1727204031.99515: stdout chunk (state=3): >>><<< 8975 1727204031.99546: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204031.99572: _low_level_execute_command(): starting 8975 1727204031.99681: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307 `" && echo ansible-tmp-1727204031.9955513-9732-246054157459307="` echo /root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307 `" ) && sleep 0' 8975 1727204032.01028: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204032.01033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204032.01036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204032.01047: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204032.01211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204032.01484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204032.01517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204032.03506: stdout chunk (state=3): >>>ansible-tmp-1727204031.9955513-9732-246054157459307=/root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307 <<< 8975 1727204032.03687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 
1727204032.03774: stderr chunk (state=3): >>><<< 8975 1727204032.03784: stdout chunk (state=3): >>><<< 8975 1727204032.03810: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204031.9955513-9732-246054157459307=/root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204032.03850: variable 'ansible_module_compression' from source: unknown 8975 1727204032.04025: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8975 1727204032.04198: variable 'ansible_facts' from source: unknown 8975 1727204032.04601: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/AnsiballZ_setup.py 8975 1727204032.05133: Sending initial data 8975 1727204032.05137: Sent initial data (152 bytes) 8975 1727204032.06110: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204032.06381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204032.06605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204032.06674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204032.08258: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204032.08359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204032.08453: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp3d4sd3xn /root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/AnsiballZ_setup.py <<< 8975 1727204032.08467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/AnsiballZ_setup.py" <<< 8975 1727204032.08557: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp3d4sd3xn" to remote "/root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/AnsiballZ_setup.py" <<< 8975 1727204032.11370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204032.11384: stdout chunk (state=3): >>><<< 8975 1727204032.11400: stderr chunk (state=3): >>><<< 8975 1727204032.11435: done transferring module to remote 8975 1727204032.11676: _low_level_execute_command(): starting 8975 1727204032.11680: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/ /root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/AnsiballZ_setup.py && sleep 0' 8975 1727204032.12853: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204032.12980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204032.13209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204032.13274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204032.15102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
8975 1727204032.15186: stderr chunk (state=3): >>><<< 8975 1727204032.15196: stdout chunk (state=3): >>><<< 8975 1727204032.15219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204032.15237: _low_level_execute_command(): starting 8975 1727204032.15248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/AnsiballZ_setup.py && sleep 0' 8975 1727204032.15891: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204032.15908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204032.15924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204032.15946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204032.15964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204032.15980: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204032.15995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204032.16017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204032.16087: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204032.16123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204032.16141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204032.16164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204032.16295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204032.81075: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], 
"ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3058, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 658, "free": 3058}, "nocache": {"free": 3481, "used": 235}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bio<<< 8975 1727204032.81107: stdout chunk (state=3): >>>s_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 378, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251327422464, "block_size": 4096, "block_total": 64479564, "block_available": 61359234, "block_used": 3120330, "inode_total": 16384000, "inode_available": 16301531, "inode_used": 82469, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": 
"02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_loadavg": {"1m": 0.482421875, "5m": 0.390625, "15m": 0.18310546875}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "52", "epoch": "1727204032", "epoch_int": "1727204032", "date": "2024-09-24", "time": "14:53:52", "iso8601_micro": "2024-09-24T18:53:52.806623Z", "iso8601": "2024-09-24T18:53:52Z", "iso8601_basic": "20240924T145352806623", "iso8601_basic_short": "20240924T145352", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8975 1727204032.83246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204032.83259: stdout chunk (state=3): >>><<< 8975 1727204032.83281: stderr chunk (state=3): >>><<< 8975 1727204032.83320: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3058, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 658, "free": 3058}, "nocache": {"free": 3481, "used": 235}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 378, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251327422464, "block_size": 4096, "block_total": 64479564, "block_available": 61359234, "block_used": 3120330, "inode_total": 16384000, "inode_available": 16301531, "inode_used": 82469, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": 
"255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_loadavg": {"1m": 0.482421875, "5m": 0.390625, "15m": 0.18310546875}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "52", "epoch": "1727204032", "epoch_int": "1727204032", "date": "2024-09-24", "time": "14:53:52", "iso8601_micro": "2024-09-24T18:53:52.806623Z", "iso8601": "2024-09-24T18:53:52Z", "iso8601_basic": "20240924T145352806623", "iso8601_basic_short": "20240924T145352", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204032.83685: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204032.83728: _low_level_execute_command(): starting 8975 1727204032.83738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204031.9955513-9732-246054157459307/ > /dev/null 2>&1 && sleep 0' 8975 1727204032.84441: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204032.84459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204032.84477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204032.84498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204032.84515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204032.84540: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204032.84554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204032.84651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204032.84669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204032.84691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204032.84714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204032.84814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204032.86813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204032.86826: stdout chunk (state=3): >>><<< 8975 1727204032.86843: stderr chunk (state=3): >>><<< 8975 1727204032.86867: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204032.86881: handler run complete 8975 1727204032.87026: variable 'ansible_facts' from source: unknown 8975 1727204032.87158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204032.87397: variable 'ansible_facts' from source: unknown 8975 1727204032.87454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204032.87541: attempt loop complete, returning result 8975 1727204032.87545: _execute() done 8975 1727204032.87548: dumping result to json 8975 1727204032.87566: done dumping result, returning 8975 1727204032.87576: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-9356-306d-000000000129] 8975 1727204032.87580: sending task result for task 127b8e07-fff9-9356-306d-000000000129 8975 1727204032.87838: done sending task result for task 127b8e07-fff9-9356-306d-000000000129 8975 1727204032.87841: WORKER PROCESS EXITING ok: [managed-node2] 8975 1727204032.88044: no more pending results, returning what we have 8975 1727204032.88046: results queue empty 8975 1727204032.88047: checking for any_errors_fatal 8975 1727204032.88048: done checking for any_errors_fatal 8975 1727204032.88048: checking for max_fail_percentage 8975 1727204032.88049: done checking for max_fail_percentage 8975 1727204032.88050: checking to see if all hosts have failed and the running result is not ok 8975 1727204032.88051: done checking to see if all hosts have failed 8975 1727204032.88051: getting the remaining hosts for this loop 8975 1727204032.88052: done getting the remaining hosts for this loop 8975 1727204032.88055: getting the next task for host managed-node2 8975 1727204032.88059: done getting next task for host managed-node2 8975 1727204032.88060: ^ task is: TASK: meta (flush_handlers) 8975 1727204032.88062: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204032.88064: getting variables 8975 1727204032.88067: in VariableManager get_vars() 8975 1727204032.88092: Calling all_inventory to load vars for managed-node2 8975 1727204032.88095: Calling groups_inventory to load vars for managed-node2 8975 1727204032.88097: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204032.88106: Calling all_plugins_play to load vars for managed-node2 8975 1727204032.88108: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204032.88110: Calling groups_plugins_play to load vars for managed-node2 8975 1727204032.88214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204032.88349: done with get_vars() 8975 1727204032.88356: done getting variables 8975 1727204032.88410: in VariableManager get_vars() 8975 1727204032.88426: Calling all_inventory to load vars for managed-node2 8975 1727204032.88428: Calling groups_inventory to load vars for managed-node2 8975 1727204032.88429: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204032.88433: Calling all_plugins_play to load vars for managed-node2 8975 1727204032.88434: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204032.88436: Calling groups_plugins_play to load vars for managed-node2 8975 1727204032.88523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204032.88641: done with get_vars() 8975 1727204032.88651: done queuing things up, now waiting for results queue to drain 8975 1727204032.88656: results queue empty 8975 1727204032.88657: checking for any_errors_fatal 8975 1727204032.88659: done checking for any_errors_fatal 8975 1727204032.88660: checking for max_fail_percentage 8975 1727204032.88661: done checking for max_fail_percentage 8975 1727204032.88661: checking to see if all hosts have failed and the running result is not ok 8975 1727204032.88662: done checking to see if all hosts have failed 8975 1727204032.88662: getting the remaining hosts for this loop 8975 1727204032.88663: done getting the remaining hosts for this loop 8975 1727204032.88665: getting the next task for host managed-node2 8975 1727204032.88669: done getting next task for host managed-node2 8975 1727204032.88671: ^ task is: TASK: INIT Prepare setup 8975 1727204032.88672: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204032.88674: getting variables 8975 1727204032.88674: in VariableManager get_vars() 8975 1727204032.88683: Calling all_inventory to load vars for managed-node2 8975 1727204032.88685: Calling groups_inventory to load vars for managed-node2 8975 1727204032.88686: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204032.88690: Calling all_plugins_play to load vars for managed-node2 8975 1727204032.88691: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204032.88693: Calling groups_plugins_play to load vars for managed-node2 8975 1727204032.88779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204032.88894: done with get_vars() 8975 1727204032.88901: done getting variables 8975 1727204032.88963: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:15 Tuesday 24 September 2024 14:53:52 -0400 (0:00:00.976) 0:00:04.206 ***** 8975 1727204032.88987: entering _queue_task() for managed-node2/debug 8975 1727204032.88989: Creating lock for debug 8975 1727204032.89404: worker is 1 (out of 1 available) 8975 1727204032.89416: exiting _queue_task() for managed-node2/debug 8975 1727204032.89426: done queuing things up, now waiting for results queue to drain 8975 1727204032.89428: waiting for pending results... 
8975 1727204032.89894: running TaskExecutor() for managed-node2/TASK: INIT Prepare setup 8975 1727204032.89899: in run() - task 127b8e07-fff9-9356-306d-00000000000b 8975 1727204032.89903: variable 'ansible_search_path' from source: unknown 8975 1727204032.89906: calling self._execute() 8975 1727204032.89908: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204032.89989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204032.89993: variable 'omit' from source: magic vars 8975 1727204032.90439: variable 'ansible_distribution_major_version' from source: facts 8975 1727204032.90449: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204032.90455: variable 'omit' from source: magic vars 8975 1727204032.90472: variable 'omit' from source: magic vars 8975 1727204032.90499: variable 'omit' from source: magic vars 8975 1727204032.90543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204032.90575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204032.90592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204032.90607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204032.90618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204032.90649: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204032.90653: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204032.90655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204032.90730: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204032.90734: Set connection var ansible_connection to ssh 8975 1727204032.90737: Set connection var ansible_shell_executable to /bin/sh 8975 1727204032.90749: Set connection var ansible_timeout to 10 8975 1727204032.90753: Set connection var ansible_shell_type to sh 8975 1727204032.90758: Set connection var ansible_pipelining to False 8975 1727204032.90780: variable 'ansible_shell_executable' from source: unknown 8975 1727204032.90783: variable 'ansible_connection' from source: unknown 8975 1727204032.90786: variable 'ansible_module_compression' from source: unknown 8975 1727204032.90788: variable 'ansible_shell_type' from source: unknown 8975 1727204032.90791: variable 'ansible_shell_executable' from source: unknown 8975 1727204032.90794: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204032.90796: variable 'ansible_pipelining' from source: unknown 8975 1727204032.90800: variable 'ansible_timeout' from source: unknown 8975 1727204032.90807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204032.90928: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204032.90935: variable 'omit' from source: magic vars 8975 1727204032.90940: starting attempt loop 8975 1727204032.90943: running the handler 8975 1727204032.90987: handler run complete 
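The handler that just completed belongs to the debug action for the task at /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:15. Based on the action plugin, the conditional evaluated above, and the message printed in the result below, a hedged reconstruction of that task might look like the following (the actual playbook may differ in detail):

# Hedged sketch; the real task lives at tests_bond_deprecated.yml:15.
- name: INIT Prepare setup
  ansible.builtin.debug:
    msg: "##################################################"   # the row of '#' printed in the result below
  when: ansible_distribution_major_version != '6'   # conditional evaluated in the trace; it may be inherited from the play rather than set on this task
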
8975 1727204032.91005: attempt loop complete, returning result 8975 1727204032.91008: _execute() done 8975 1727204032.91011: dumping result to json 8975 1727204032.91014: done dumping result, returning 8975 1727204032.91024: done running TaskExecutor() for managed-node2/TASK: INIT Prepare setup [127b8e07-fff9-9356-306d-00000000000b] 8975 1727204032.91027: sending task result for task 127b8e07-fff9-9356-306d-00000000000b 8975 1727204032.91123: done sending task result for task 127b8e07-fff9-9356-306d-00000000000b 8975 1727204032.91126: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ################################################## 8975 1727204032.91182: no more pending results, returning what we have 8975 1727204032.91184: results queue empty 8975 1727204032.91185: checking for any_errors_fatal 8975 1727204032.91186: done checking for any_errors_fatal 8975 1727204032.91187: checking for max_fail_percentage 8975 1727204032.91189: done checking for max_fail_percentage 8975 1727204032.91189: checking to see if all hosts have failed and the running result is not ok 8975 1727204032.91191: done checking to see if all hosts have failed 8975 1727204032.91191: getting the remaining hosts for this loop 8975 1727204032.91193: done getting the remaining hosts for this loop 8975 1727204032.91197: getting the next task for host managed-node2 8975 1727204032.91207: done getting next task for host managed-node2 8975 1727204032.91210: ^ task is: TASK: Install dnsmasq 8975 1727204032.91213: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204032.91216: getting variables 8975 1727204032.91217: in VariableManager get_vars() 8975 1727204032.91334: Calling all_inventory to load vars for managed-node2 8975 1727204032.91337: Calling groups_inventory to load vars for managed-node2 8975 1727204032.91340: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204032.91353: Calling all_plugins_play to load vars for managed-node2 8975 1727204032.91355: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204032.91357: Calling groups_plugins_play to load vars for managed-node2 8975 1727204032.91467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204032.91593: done with get_vars() 8975 1727204032.91601: done getting variables 8975 1727204032.91648: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:53:52 -0400 (0:00:00.026) 0:00:04.233 ***** 8975 1727204032.91675: entering _queue_task() for managed-node2/package 8975 1727204032.91930: worker is 1 (out of 1 available) 8975 1727204032.91946: exiting _queue_task() for managed-node2/package 8975 1727204032.91959: done queuing things up, now waiting for results queue to drain 8975 1727204032.91960: waiting for pending results... 8975 1727204032.92119: running TaskExecutor() for managed-node2/TASK: Install dnsmasq 8975 1727204032.92199: in run() - task 127b8e07-fff9-9356-306d-00000000000f 8975 1727204032.92207: variable 'ansible_search_path' from source: unknown 8975 1727204032.92210: variable 'ansible_search_path' from source: unknown 8975 1727204032.92243: calling self._execute() 8975 1727204032.92316: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204032.92323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204032.92332: variable 'omit' from source: magic vars 8975 1727204032.92871: variable 'ansible_distribution_major_version' from source: facts 8975 1727204032.92875: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204032.92878: variable 'omit' from source: magic vars 8975 1727204032.92881: variable 'omit' from source: magic vars 8975 1727204032.92964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204032.95276: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204032.95356: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204032.95404: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204032.95457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204032.95491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204032.95606: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204032.95642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204032.95681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204032.95730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204032.95751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204032.95872: variable '__network_is_ostree' from source: set_fact 8975 1727204032.95884: variable 'omit' from source: magic vars 8975 1727204032.95919: variable 'omit' from source: magic vars 8975 1727204032.95956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204032.95994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204032.96018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204032.96042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204032.96057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204032.96096: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204032.96103: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204032.96111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204032.96225: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204032.96233: Set connection var ansible_connection to ssh 8975 1727204032.96244: Set connection var ansible_shell_executable to /bin/sh 8975 1727204032.96255: Set connection var ansible_timeout to 10 8975 1727204032.96261: Set connection var ansible_shell_type to sh 8975 1727204032.96281: Set connection var ansible_pipelining to False 8975 1727204032.96311: variable 'ansible_shell_executable' from source: unknown 8975 1727204032.96320: variable 'ansible_connection' from source: unknown 8975 1727204032.96328: variable 'ansible_module_compression' from source: unknown 8975 1727204032.96334: variable 'ansible_shell_type' from source: unknown 8975 1727204032.96340: variable 'ansible_shell_executable' from source: unknown 8975 1727204032.96347: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204032.96354: variable 'ansible_pipelining' from source: unknown 8975 1727204032.96472: variable 'ansible_timeout' from source: unknown 8975 1727204032.96475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204032.96486: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204032.96503: variable 'omit' from source: magic vars 8975 1727204032.96513: starting attempt loop 8975 1727204032.96520: running the handler 8975 1727204032.96530: variable 'ansible_facts' from source: unknown 8975 1727204032.96537: variable 'ansible_facts' from source: unknown 8975 1727204032.96578: _low_level_execute_command(): starting 8975 1727204032.96590: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204032.97318: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204032.97336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204032.97350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204032.97372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204032.97391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204032.97403: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204032.97418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204032.97438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204032.97450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204032.97461: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8975 1727204032.97484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204032.97500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204032.97589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204032.97619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204032.97719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204032.99414: stdout chunk (state=3): >>>/root <<< 8975 1727204032.99625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204032.99629: stdout chunk (state=3): >>><<< 8975 1727204032.99631: stderr chunk (state=3): >>><<< 8975 1727204032.99661: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204032.99683: _low_level_execute_command(): starting 8975 1727204032.99694: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742 `" && echo ansible-tmp-1727204032.996709-9766-29785703933742="` echo /root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742 `" ) && sleep 0' 8975 1727204033.00354: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204033.00375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204033.00390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204033.00436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204033.00449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204033.00478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204033.00552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204033.00571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204033.00594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204033.00693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204033.08491: stdout chunk (state=3): >>>ansible-tmp-1727204032.996709-9766-29785703933742=/root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742 <<< 8975 1727204033.08588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204033.08652: stderr chunk (state=3): >>><<< 8975 1727204033.08656: stdout chunk (state=3): >>><<< 8975 1727204033.08675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204032.996709-9766-29785703933742=/root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204033.08711: variable 'ansible_module_compression' from source: unknown 8975 1727204033.08763: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 8975 1727204033.08769: ANSIBALLZ: Acquiring lock 8975 1727204033.08772: ANSIBALLZ: Lock acquired: 140501807209920 8975 1727204033.08774: ANSIBALLZ: Creating module 8975 1727204033.25656: ANSIBALLZ: Writing module into payload 8975 1727204033.25804: ANSIBALLZ: Writing module 8975 1727204033.25831: ANSIBALLZ: Renaming module 8975 1727204033.25835: ANSIBALLZ: Done creating module 8975 1727204033.25856: variable 'ansible_facts' from source: unknown 8975 1727204033.25927: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/AnsiballZ_dnf.py 8975 1727204033.26044: Sending initial data 8975 1727204033.26047: Sent initial data (148 bytes) 8975 1727204033.26536: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204033.26545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204033.26562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204033.26569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204033.26626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204033.26644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204033.26712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204033.28369: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204033.28444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204033.28515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp_d0g7bno /root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/AnsiballZ_dnf.py <<< 8975 1727204033.28522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/AnsiballZ_dnf.py" <<< 8975 1727204033.28584: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp_d0g7bno" to remote "/root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/AnsiballZ_dnf.py" <<< 8975 1727204033.28587: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/AnsiballZ_dnf.py" <<< 8975 1727204033.29387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204033.29469: stderr chunk (state=3): >>><<< 8975 1727204033.29473: stdout chunk (state=3): >>><<< 8975 1727204033.29494: done transferring module to remote 8975 1727204033.29506: _low_level_execute_command(): starting 8975 1727204033.29510: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/ /root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/AnsiballZ_dnf.py && sleep 0' 8975 1727204033.30013: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204033.30017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204033.30020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204033.30022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204033.30072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204033.30092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204033.30095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204033.30159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204033.32000: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 8975 1727204033.32063: stderr chunk (state=3): >>><<< 8975 1727204033.32069: stdout chunk (state=3): >>><<< 8975 1727204033.32085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204033.32089: _low_level_execute_command(): starting 8975 1727204033.32093: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/AnsiballZ_dnf.py && sleep 0' 8975 1727204033.32605: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204033.32609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204033.32612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204033.32614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204033.32669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204033.32673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204033.32679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204033.32752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204036.23155: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-1.fc40.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, 
"disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 8975 1727204036.29095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204036.29157: stderr chunk (state=3): >>><<< 8975 1727204036.29161: stdout chunk (state=3): >>><<< 8975 1727204036.29184: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-1.fc40.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
8975 1727204036.29221: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204036.29230: _low_level_execute_command(): starting 8975 1727204036.29235: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204032.996709-9766-29785703933742/ > /dev/null 2>&1 && sleep 0' 8975 1727204036.29748: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204036.29751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204036.29754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204036.29757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204036.29811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204036.29815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204036.29827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204036.29903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204036.31977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204036.32034: stderr chunk (state=3): >>><<< 8975 1727204036.32038: stdout chunk (state=3): >>><<< 8975 1727204036.32051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204036.32079: handler run complete 8975 1727204036.32202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204036.32340: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204036.32387: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204036.32412: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204036.32437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204036.32499: variable '__install_status' from source: unknown 8975 1727204036.32516: Evaluated conditional (__install_status is success): True 8975 1727204036.32531: attempt loop complete, returning result 8975 1727204036.32534: _execute() done 8975 1727204036.32536: dumping result to json 8975 1727204036.32543: done dumping result, returning 8975 1727204036.32556: done running TaskExecutor() for managed-node2/TASK: Install dnsmasq [127b8e07-fff9-9356-306d-00000000000f] 8975 1727204036.32571: sending task result for task 127b8e07-fff9-9356-306d-00000000000f 8975 1727204036.32672: done sending task result for task 127b8e07-fff9-9356-306d-00000000000f 8975 1727204036.32675: WORKER PROCESS EXITING changed: [managed-node2] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-1.fc40.x86_64" ] } 8975 1727204036.32795: no more pending results, returning what we have 8975 1727204036.32798: results queue empty 8975 1727204036.32799: checking for any_errors_fatal 8975 1727204036.32807: done checking for any_errors_fatal 8975 1727204036.32808: checking for max_fail_percentage 8975 1727204036.32809: done checking for max_fail_percentage 8975 1727204036.32810: checking to see if all hosts have failed and the running result is not ok 8975 1727204036.32811: done checking to see if all hosts have failed 8975 1727204036.32812: getting the remaining hosts for this loop 8975 1727204036.32814: done getting the remaining hosts for this loop 8975 1727204036.32817: getting the next task for host managed-node2 8975 1727204036.32824: done getting next task for host managed-node2 8975 1727204036.32826: ^ task is: TASK: Install pgrep, sysctl 8975 1727204036.32828: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204036.32832: getting variables 8975 1727204036.32833: in VariableManager get_vars() 8975 1727204036.32880: Calling all_inventory to load vars for managed-node2 8975 1727204036.32883: Calling groups_inventory to load vars for managed-node2 8975 1727204036.32885: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204036.32896: Calling all_plugins_play to load vars for managed-node2 8975 1727204036.32899: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204036.32902: Calling groups_plugins_play to load vars for managed-node2 8975 1727204036.33075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204036.33230: done with get_vars() 8975 1727204036.33238: done getting variables 8975 1727204036.33286: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:53:56 -0400 (0:00:03.416) 0:00:07.650 ***** 8975 1727204036.33315: entering _queue_task() for managed-node2/package 8975 1727204036.33624: worker is 1 (out of 1 available) 8975 1727204036.33639: exiting _queue_task() for managed-node2/package 8975 1727204036.33654: done queuing things up, now waiting for results queue to drain 8975 1727204036.33656: waiting for pending results... 8975 1727204036.33987: running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl 8975 1727204036.33995: in run() - task 127b8e07-fff9-9356-306d-000000000010 8975 1727204036.34020: variable 'ansible_search_path' from source: unknown 8975 1727204036.34031: variable 'ansible_search_path' from source: unknown 8975 1727204036.34085: calling self._execute() 8975 1727204036.34273: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204036.34277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204036.34279: variable 'omit' from source: magic vars 8975 1727204036.34680: variable 'ansible_distribution_major_version' from source: facts 8975 1727204036.34699: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204036.34844: variable 'ansible_os_family' from source: facts 8975 1727204036.34857: Evaluated conditional (ansible_os_family == 'RedHat'): True 8975 1727204036.35082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204036.35563: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204036.35600: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204036.35648: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204036.35691: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204036.35802: variable 'ansible_distribution_major_version' from source: facts 8975 1727204036.35826: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): 
False 8975 1727204036.35835: when evaluation is False, skipping this task 8975 1727204036.35842: _execute() done 8975 1727204036.35849: dumping result to json 8975 1727204036.35856: done dumping result, returning 8975 1727204036.35869: done running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl [127b8e07-fff9-9356-306d-000000000010] 8975 1727204036.35880: sending task result for task 127b8e07-fff9-9356-306d-000000000010 8975 1727204036.36111: done sending task result for task 127b8e07-fff9-9356-306d-000000000010 8975 1727204036.36115: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 8975 1727204036.36176: no more pending results, returning what we have 8975 1727204036.36179: results queue empty 8975 1727204036.36180: checking for any_errors_fatal 8975 1727204036.36262: done checking for any_errors_fatal 8975 1727204036.36264: checking for max_fail_percentage 8975 1727204036.36270: done checking for max_fail_percentage 8975 1727204036.36271: checking to see if all hosts have failed and the running result is not ok 8975 1727204036.36273: done checking to see if all hosts have failed 8975 1727204036.36274: getting the remaining hosts for this loop 8975 1727204036.36276: done getting the remaining hosts for this loop 8975 1727204036.36281: getting the next task for host managed-node2 8975 1727204036.36289: done getting next task for host managed-node2 8975 1727204036.36292: ^ task is: TASK: Install pgrep, sysctl 8975 1727204036.36411: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204036.36416: getting variables 8975 1727204036.36418: in VariableManager get_vars() 8975 1727204036.36460: Calling all_inventory to load vars for managed-node2 8975 1727204036.36463: Calling groups_inventory to load vars for managed-node2 8975 1727204036.36521: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204036.36533: Calling all_plugins_play to load vars for managed-node2 8975 1727204036.36536: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204036.36539: Calling groups_plugins_play to load vars for managed-node2 8975 1727204036.36990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204036.37252: done with get_vars() 8975 1727204036.37269: done getting variables 8975 1727204036.37342: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.040) 0:00:07.691 ***** 8975 1727204036.37412: entering _queue_task() for managed-node2/package 8975 1727204036.37888: worker is 1 (out of 1 available) 8975 1727204036.37902: exiting _queue_task() for managed-node2/package 8975 1727204036.37916: done queuing things up, now waiting for results queue to drain 8975 1727204036.37918: waiting for pending results... 8975 1727204036.38451: running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl 8975 1727204036.38457: in run() - task 127b8e07-fff9-9356-306d-000000000011 8975 1727204036.38459: variable 'ansible_search_path' from source: unknown 8975 1727204036.38461: variable 'ansible_search_path' from source: unknown 8975 1727204036.38464: calling self._execute() 8975 1727204036.38550: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204036.38554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204036.38558: variable 'omit' from source: magic vars 8975 1727204036.39021: variable 'ansible_distribution_major_version' from source: facts 8975 1727204036.39028: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204036.39374: variable 'ansible_os_family' from source: facts 8975 1727204036.39381: Evaluated conditional (ansible_os_family == 'RedHat'): True 8975 1727204036.39626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204036.40557: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204036.40700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204036.40908: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204036.40969: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204036.41049: variable 'ansible_distribution_major_version' from source: facts 8975 1727204036.41150: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 
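At this point the play has reached the second of two tasks that share the name "Install pgrep, sysctl" (lines 17 and 26 of create_test_interfaces_with_dhcp.yml). The first was skipped because ansible_distribution_major_version is version('6', '<=') evaluated False; this one proceeds because the version('7', '>=') check above is True. A hedged sketch of how the pair is probably laid out follows; the exact when: placement is an assumption (the ansible_distribution_major_version != '6' and ansible_os_family == 'RedHat' conditionals seen in the log may well be inherited from an enclosing block), and the EL6 package set is not visible in this excerpt:

    # create_test_interfaces_with_dhcp.yml:17 -- skipped on this Fedora 40 host
    - name: Install pgrep, sysctl
      package:
        name: "..."                 # EL6 package set; not visible in this excerpt
        state: present
      when: ansible_distribution_major_version is version('6', '<=')

    # create_test_interfaces_with_dhcp.yml:26 -- the variant executed below
    - name: Install pgrep, sysctl
      package:
        name: procps-ng             # taken from the module args shown further down
        state: present
      when: ansible_distribution_major_version is version('7', '>=')

Everything else in the invocation dicts (allow_downgrade, install_weak_deps, lock_timeout, use_backend, and so on) is most likely just the dnf module's defaults being echoed back in the result, not options set in the playbook.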
8975 1727204036.41153: variable 'omit' from source: magic vars 8975 1727204036.41247: variable 'omit' from source: magic vars 8975 1727204036.41680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204036.45463: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204036.45675: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204036.45910: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204036.45914: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204036.45917: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204036.45933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204036.45970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204036.46017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204036.46057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204036.46237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204036.46241: variable '__network_is_ostree' from source: set_fact 8975 1727204036.46243: variable 'omit' from source: magic vars 8975 1727204036.46621: variable 'omit' from source: magic vars 8975 1727204036.46649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204036.46679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204036.46719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204036.46736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204036.46844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204036.46967: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204036.46973: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204036.46982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204036.47389: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204036.47392: Set connection var ansible_connection to ssh 8975 1727204036.47395: Set connection var ansible_shell_executable to /bin/sh 8975 1727204036.47397: Set connection var ansible_timeout to 10 8975 1727204036.47443: Set connection var 
ansible_shell_type to sh 8975 1727204036.47452: Set connection var ansible_pipelining to False 8975 1727204036.47456: variable 'ansible_shell_executable' from source: unknown 8975 1727204036.47458: variable 'ansible_connection' from source: unknown 8975 1727204036.47460: variable 'ansible_module_compression' from source: unknown 8975 1727204036.47462: variable 'ansible_shell_type' from source: unknown 8975 1727204036.47464: variable 'ansible_shell_executable' from source: unknown 8975 1727204036.47472: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204036.47478: variable 'ansible_pipelining' from source: unknown 8975 1727204036.47483: variable 'ansible_timeout' from source: unknown 8975 1727204036.47486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204036.47678: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204036.47682: variable 'omit' from source: magic vars 8975 1727204036.47684: starting attempt loop 8975 1727204036.47687: running the handler 8975 1727204036.47689: variable 'ansible_facts' from source: unknown 8975 1727204036.47771: variable 'ansible_facts' from source: unknown 8975 1727204036.47775: _low_level_execute_command(): starting 8975 1727204036.47778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204036.48546: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204036.48561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204036.48616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204036.48620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204036.48670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204036.48744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204036.50458: stdout chunk (state=3): >>>/root <<< 8975 1727204036.50569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204036.50648: stderr chunk (state=3): >>><<< 8975 1727204036.50652: stdout chunk (state=3): >>><<< 8975 1727204036.50682: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204036.50703: _low_level_execute_command(): starting 8975 1727204036.50707: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834 `" && echo ansible-tmp-1727204036.5068226-10057-161366880684834="` echo /root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834 `" ) && sleep 0' 8975 1727204036.51410: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204036.51456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204036.51534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204036.53512: stdout chunk (state=3): >>>ansible-tmp-1727204036.5068226-10057-161366880684834=/root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834 <<< 8975 1727204036.53639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204036.53722: stderr chunk (state=3): >>><<< 8975 1727204036.53727: stdout chunk (state=3): >>><<< 8975 1727204036.53751: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204036.5068226-10057-161366880684834=/root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204036.53783: variable 'ansible_module_compression' from source: unknown 8975 1727204036.53851: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 8975 1727204036.53909: variable 'ansible_facts' from source: unknown 8975 1727204036.54030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/AnsiballZ_dnf.py 8975 1727204036.54289: Sending initial data 8975 1727204036.54352: Sent initial data (151 bytes) 8975 1727204036.55474: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204036.55479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204036.55530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204036.57199: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8975 1727204036.57237: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204036.57339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204036.57413: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp9wcdypjl /root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/AnsiballZ_dnf.py <<< 8975 1727204036.57429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/AnsiballZ_dnf.py" <<< 8975 1727204036.57531: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp9wcdypjl" to remote "/root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/AnsiballZ_dnf.py" <<< 8975 1727204036.58659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204036.58802: stderr chunk (state=3): >>><<< 8975 1727204036.58806: stdout chunk (state=3): >>><<< 8975 1727204036.58809: done transferring module to remote 8975 1727204036.58811: _low_level_execute_command(): starting 8975 1727204036.58865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/ /root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/AnsiballZ_dnf.py && sleep 0' 8975 1727204036.59471: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204036.59481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204036.59519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204036.59526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204036.59529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204036.59532: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204036.59534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204036.59597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204036.59601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204036.59604: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8975 1727204036.59607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204036.59609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204036.59611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204036.59614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204036.59616: stderr chunk (state=3): >>>debug2: match found <<< 8975 1727204036.59635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204036.59688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204036.59707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 
1727204036.59710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204036.59819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204036.61803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204036.61807: stdout chunk (state=3): >>><<< 8975 1727204036.61810: stderr chunk (state=3): >>><<< 8975 1727204036.61973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204036.61977: _low_level_execute_command(): starting 8975 1727204036.61980: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/AnsiballZ_dnf.py && sleep 0' 8975 1727204036.62571: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204036.62594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204036.62597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204036.62708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204037.69669: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": 
false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 8975 1727204037.74031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204037.74389: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 8975 1727204037.74393: stdout chunk (state=3): >>><<< 8975 1727204037.74401: stderr chunk (state=3): >>><<< 8975 1727204037.74404: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
8975 1727204037.74416: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204037.74425: _low_level_execute_command(): starting 8975 1727204037.74427: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204036.5068226-10057-161366880684834/ > /dev/null 2>&1 && sleep 0' 8975 1727204037.76505: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204037.76555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204037.76654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204037.76682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204037.76838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204037.78905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204037.79068: stderr chunk (state=3): >>><<< 8975 1727204037.79074: stdout chunk (state=3): >>><<< 8975 1727204037.79495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204037.79500: handler run complete 8975 1727204037.79503: attempt loop complete, returning result 8975 1727204037.79505: _execute() done 8975 1727204037.79507: dumping result to json 8975 1727204037.79509: done dumping result, returning 8975 1727204037.79512: done running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl [127b8e07-fff9-9356-306d-000000000011] 8975 1727204037.79514: sending task result for task 127b8e07-fff9-9356-306d-000000000011 8975 1727204037.79621: done sending task result for task 127b8e07-fff9-9356-306d-000000000011 8975 1727204037.79625: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8975 1727204037.79777: no more pending results, returning what we have 8975 1727204037.79782: results queue empty 8975 1727204037.79782: checking for any_errors_fatal 8975 1727204037.79791: done checking for any_errors_fatal 8975 1727204037.79792: checking for max_fail_percentage 8975 1727204037.79794: done checking for max_fail_percentage 8975 1727204037.79795: checking to see if all hosts have failed and the running result is not ok 8975 1727204037.79796: done checking to see if all hosts have failed 8975 1727204037.79797: getting the remaining hosts for this loop 8975 1727204037.79799: done getting the remaining hosts for this loop 8975 1727204037.79804: getting the next task for host managed-node2 8975 1727204037.79819: done getting next task for host managed-node2 8975 1727204037.79824: ^ task is: TASK: Create test interfaces 8975 1727204037.79829: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204037.79833: getting variables 8975 1727204037.79836: in VariableManager get_vars() 8975 1727204037.80364: Calling all_inventory to load vars for managed-node2 8975 1727204037.80371: Calling groups_inventory to load vars for managed-node2 8975 1727204037.80373: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204037.80388: Calling all_plugins_play to load vars for managed-node2 8975 1727204037.80392: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204037.80396: Calling groups_plugins_play to load vars for managed-node2 8975 1727204037.81288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204037.81545: done with get_vars() 8975 1727204037.81562: done getting variables 8975 1727204037.81678: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:53:57 -0400 (0:00:01.442) 0:00:09.134 ***** 8975 1727204037.81711: entering _queue_task() for managed-node2/shell 8975 1727204037.81713: Creating lock for shell 8975 1727204037.82111: worker is 1 (out of 1 available) 8975 1727204037.82126: exiting _queue_task() for managed-node2/shell 8975 1727204037.82139: done queuing things up, now waiting for results queue to drain 8975 1727204037.82140: waiting for pending results... 
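The next task, "Create test interfaces" (create_test_interfaces_with_dhcp.yml:35), is the first shell task of the run, which is why a new lock is created for the shell action plugin before it is queued. Its script body is not visible in this excerpt, so the sketch below is only a plausible task header with the real command left as a comment; the log just below shows it being templated with the dhcp_interface1 and dhcp_interface2 play vars:

    - name: Create test interfaces
      shell: |
        # actual script body not shown in this excerpt; the log below only shows it
        # being templated with the dhcp_interface1 and dhcp_interface2 play vars

As with the package tasks, the shell/command module is shipped and executed the same way: a temporary directory is created under /root/.ansible/tmp, the AnsiballZ payload is copied over sftp, made executable, run with /usr/bin/python3.12, and then removed.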
8975 1727204037.82484: running TaskExecutor() for managed-node2/TASK: Create test interfaces 8975 1727204037.82873: in run() - task 127b8e07-fff9-9356-306d-000000000012 8975 1727204037.82877: variable 'ansible_search_path' from source: unknown 8975 1727204037.82880: variable 'ansible_search_path' from source: unknown 8975 1727204037.82883: calling self._execute() 8975 1727204037.82940: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204037.82954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204037.82972: variable 'omit' from source: magic vars 8975 1727204037.83493: variable 'ansible_distribution_major_version' from source: facts 8975 1727204037.83512: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204037.83529: variable 'omit' from source: magic vars 8975 1727204037.83585: variable 'omit' from source: magic vars 8975 1727204037.84256: variable 'dhcp_interface1' from source: play vars 8975 1727204037.84575: variable 'dhcp_interface2' from source: play vars 8975 1727204037.84579: variable 'omit' from source: magic vars 8975 1727204037.84597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204037.84658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204037.84706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204037.84825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204037.84847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204037.84918: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204037.84930: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204037.85015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204037.85189: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204037.85197: Set connection var ansible_connection to ssh 8975 1727204037.85208: Set connection var ansible_shell_executable to /bin/sh 8975 1727204037.85220: Set connection var ansible_timeout to 10 8975 1727204037.85229: Set connection var ansible_shell_type to sh 8975 1727204037.85245: Set connection var ansible_pipelining to False 8975 1727204037.85279: variable 'ansible_shell_executable' from source: unknown 8975 1727204037.85287: variable 'ansible_connection' from source: unknown 8975 1727204037.85294: variable 'ansible_module_compression' from source: unknown 8975 1727204037.85300: variable 'ansible_shell_type' from source: unknown 8975 1727204037.85306: variable 'ansible_shell_executable' from source: unknown 8975 1727204037.85312: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204037.85320: variable 'ansible_pipelining' from source: unknown 8975 1727204037.85326: variable 'ansible_timeout' from source: unknown 8975 1727204037.85374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204037.85513: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 8975 1727204037.85529: variable 'omit' from source: magic vars 8975 1727204037.85539: starting attempt loop 8975 1727204037.85545: running the handler 8975 1727204037.85559: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204037.85588: _low_level_execute_command(): starting 8975 1727204037.85673: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204037.86441: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204037.86467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204037.86490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204037.86560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204037.86621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204037.86639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204037.86671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204037.86772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204037.88496: stdout chunk (state=3): >>>/root <<< 8975 1727204037.88709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204037.88713: stdout chunk (state=3): >>><<< 8975 1727204037.88716: stderr chunk (state=3): >>><<< 8975 1727204037.88845: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204037.88849: _low_level_execute_command(): starting 8975 1727204037.88852: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628 `" && echo ansible-tmp-1727204037.887432-10320-19266892361628="` echo /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628 `" ) && sleep 0' 8975 1727204037.89694: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204037.89836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204037.89857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204037.89883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204037.89975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204037.91996: stdout chunk (state=3): >>>ansible-tmp-1727204037.887432-10320-19266892361628=/root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628 <<< 8975 1727204037.92294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204037.92588: stderr chunk (state=3): >>><<< 8975 1727204037.92600: stdout chunk (state=3): >>><<< 8975 1727204037.92646: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204037.887432-10320-19266892361628=/root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204037.92771: variable 'ansible_module_compression' from source: unknown 8975 1727204037.92954: ANSIBALLZ: Using generic lock for ansible.legacy.command 8975 1727204037.92958: ANSIBALLZ: Acquiring lock 8975 1727204037.92961: ANSIBALLZ: Lock acquired: 140501807209920 8975 1727204037.92963: ANSIBALLZ: Creating module 8975 1727204038.16868: ANSIBALLZ: Writing module into payload 8975 1727204038.16986: ANSIBALLZ: Writing module 8975 1727204038.17027: ANSIBALLZ: Renaming module 8975 1727204038.17041: ANSIBALLZ: Done creating module 8975 1727204038.17068: variable 'ansible_facts' from source: unknown 8975 1727204038.17148: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py 8975 1727204038.17349: Sending initial data 8975 1727204038.17359: Sent initial data (153 bytes) 8975 1727204038.18493: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204038.18547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204038.18571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204038.18652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204038.18718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204038.18756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204038.18777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204038.18982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204038.20544: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204038.20615: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204038.20686: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpv_p3knwv /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py <<< 8975 1727204038.20696: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py" <<< 8975 1727204038.21003: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpv_p3knwv" to remote "/root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py" <<< 8975 1727204038.22374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204038.22580: stderr chunk (state=3): >>><<< 8975 1727204038.22591: stdout chunk (state=3): >>><<< 8975 1727204038.22624: done transferring module to remote 8975 1727204038.22973: _low_level_execute_command(): starting 8975 1727204038.22977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/ /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py && sleep 0' 8975 1727204038.24078: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204038.24103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204038.24117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204038.24253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204038.24344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204038.24441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204038.24487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204038.24552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204038.26707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204038.26712: stderr chunk (state=3): >>><<< 8975 1727204038.26715: stdout chunk (state=3): >>><<< 8975 1727204038.26718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204038.26720: _low_level_execute_command(): starting 8975 1727204038.26723: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py && sleep 0' 8975 1727204038.27928: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204038.27953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204038.27972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204038.28074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204038.28174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204038.28248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204038.28351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204038.28461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204039.73940: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 3396 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 3396 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 
'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:53:58.447388", "end": "2024-09-24 14:53:59.737593", "delta": "0:00:01.290205", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204039.75669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204039.75683: stdout chunk (state=3): >>><<< 8975 1727204039.75699: stderr chunk (state=3): >>><<< 8975 1727204039.76052: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 3396 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 3396 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:53:58.447388", "end": "2024-09-24 14:53:59.737593", "delta": "0:00:01.290205", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
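The exchange above is the standard low-level execution flow for a command/shell task over the multiplexed SSH connection: resolve the remote home directory, create a per-task temporary directory, copy the generated AnsiballZ_command.py payload over SFTP, mark it executable, run it with the remote Python interpreter, and then remove the temporary directory. A condensed sketch of the same sequence in plain shell (the tmp directory name is the one from this run; the local payload path is illustrative, and the real transfer happens over the existing ControlMaster socket rather than fresh connections):

# 1. Resolve the remote user's home directory
ssh managed-node2 "/bin/sh -c 'echo ~ && sleep 0'"
# 2. Create a unique per-task temporary directory under ~/.ansible/tmp
ssh managed-node2 "/bin/sh -c '( umask 77 && mkdir -p ~/.ansible/tmp && mkdir ~/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628 )'"
# 3. Transfer the generated module payload over SFTP (local file name is illustrative)
sftp managed-node2 <<'EOF'
put /tmp/AnsiballZ_command.py /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py
EOF
# 4. Make the directory and the module executable for the remote user
ssh managed-node2 "chmod u+x /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/ /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py"
# 5. Execute the module with the remote interpreter; the JSON result comes back on stdout
ssh managed-node2 "/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/AnsiballZ_command.py"
# 6. Clean up the per-task temporary directory
ssh managed-node2 "rm -rf /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/"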
8975 1727204039.76062: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204039.76067: _low_level_execute_command(): starting 8975 1727204039.76070: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204037.887432-10320-19266892361628/ > /dev/null 2>&1 && sleep 0' 8975 1727204039.77381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204039.77385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204039.77387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204039.77389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204039.77392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204039.77394: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204039.77396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204039.77442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204039.77462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204039.77477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204039.77486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204039.77643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204039.79626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204039.79752: stderr chunk (state=3): >>><<< 8975 1727204039.79755: stdout chunk (state=3): >>><<< 8975 1727204039.79776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204039.79784: handler run complete 8975 1727204039.79811: Evaluated conditional (False): False 8975 1727204039.79880: attempt loop complete, returning result 8975 1727204039.79883: _execute() done 8975 1727204039.79886: dumping result to json 8975 1727204039.79918: done dumping result, returning 8975 1727204039.79921: done running TaskExecutor() for managed-node2/TASK: Create test interfaces [127b8e07-fff9-9356-306d-000000000012] 8975 1727204039.79924: sending task result for task 127b8e07-fff9-9356-306d-000000000012 8975 1727204039.80232: done sending task result for task 127b8e07-fff9-9356-306d-000000000012 8975 1727204039.80581: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.290205", "end": "2024-09-24 14:53:59.737593", "rc": 0, "start": "2024-09-24 14:53:58.447388" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 3396 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 3396 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 8975 1727204039.80605: no more pending results, returning what we have 8975 1727204039.80608: results queue empty 8975 1727204039.80609: checking for any_errors_fatal 8975 1727204039.80616: done checking for any_errors_fatal 8975 1727204039.80617: checking for max_fail_percentage 8975 1727204039.80619: done checking for max_fail_percentage 8975 1727204039.80620: checking to see if all hosts have failed and the running result is not ok 8975 1727204039.80621: done checking to see if all hosts have failed 8975 1727204039.80622: getting the remaining hosts for this loop 8975 1727204039.80626: done getting the remaining hosts for this loop 8975 1727204039.80630: getting the next task for host managed-node2 8975 1727204039.80638: done getting next task for host managed-node2 8975 1727204039.80641: ^ task is: TASK: Include the 
task 'get_interface_stat.yml' 8975 1727204039.80644: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204039.80647: getting variables 8975 1727204039.80649: in VariableManager get_vars() 8975 1727204039.80690: Calling all_inventory to load vars for managed-node2 8975 1727204039.80693: Calling groups_inventory to load vars for managed-node2 8975 1727204039.80695: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204039.80707: Calling all_plugins_play to load vars for managed-node2 8975 1727204039.80710: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204039.80713: Calling groups_plugins_play to load vars for managed-node2 8975 1727204039.81334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204039.81547: done with get_vars() 8975 1727204039.81561: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:53:59 -0400 (0:00:02.001) 0:00:11.135 ***** 8975 1727204039.81974: entering _queue_task() for managed-node2/include_tasks 8975 1727204039.82645: worker is 1 (out of 1 available) 8975 1727204039.82658: exiting _queue_task() for managed-node2/include_tasks 8975 1727204039.82875: done queuing things up, now waiting for results queue to drain 8975 1727204039.82877: waiting for pending results... 
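For reference, the script run by the "Create test interfaces" task above boils down to the following steps. This is a condensed sketch of the branch that actually executed on this host according to the trace (NetworkManager running, firewalld inactive, not RHEL 6), omitting the retry loop that works around the NetworkManager bug and the firewalld service handling:

set -euxo pipefail
exec 1>&2
# Create two veth pairs; NetworkManager manages only the "outer" ends
ip link add test1 type veth peer name test1p
ip link add test2 type veth peer name test2p
nmcli d set test1 managed true
nmcli d set test2 managed true
nmcli d set test1p managed false
nmcli d set test2p managed false
ip link set test1p up
ip link set test2p up
# Bridge the peer ends and give the bridge static IPv4/IPv6 addresses
ip link add name testbr type bridge forward_delay 0
nmcli d set testbr managed false
ip link set testbr up
ip addr add 192.0.2.1/24 dev testbr
ip -6 addr add 2001:DB8::1/32 dev testbr
ip link set test1p master testbr
ip link set test2p master testbr
# Serve DHCPv4/DHCPv6 with router advertisements on the bridge
dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease \
        --dhcp-range=192.0.2.1,192.0.2.254,240 \
        --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
        --enable-ra --interface=testbr --bind-interfaces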
8975 1727204039.83241: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 8975 1727204039.83511: in run() - task 127b8e07-fff9-9356-306d-000000000016 8975 1727204039.83674: variable 'ansible_search_path' from source: unknown 8975 1727204039.83678: variable 'ansible_search_path' from source: unknown 8975 1727204039.83681: calling self._execute() 8975 1727204039.83893: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204039.83976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204039.83992: variable 'omit' from source: magic vars 8975 1727204039.84943: variable 'ansible_distribution_major_version' from source: facts 8975 1727204039.84963: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204039.84985: _execute() done 8975 1727204039.85041: dumping result to json 8975 1727204039.85050: done dumping result, returning 8975 1727204039.85062: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-9356-306d-000000000016] 8975 1727204039.85147: sending task result for task 127b8e07-fff9-9356-306d-000000000016 8975 1727204039.85421: no more pending results, returning what we have 8975 1727204039.85430: in VariableManager get_vars() 8975 1727204039.85489: Calling all_inventory to load vars for managed-node2 8975 1727204039.85493: Calling groups_inventory to load vars for managed-node2 8975 1727204039.85495: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204039.85512: Calling all_plugins_play to load vars for managed-node2 8975 1727204039.85516: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204039.85520: Calling groups_plugins_play to load vars for managed-node2 8975 1727204039.86403: done sending task result for task 127b8e07-fff9-9356-306d-000000000016 8975 1727204039.86407: WORKER PROCESS EXITING 8975 1727204039.86433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204039.86860: done with get_vars() 8975 1727204039.86872: variable 'ansible_search_path' from source: unknown 8975 1727204039.86873: variable 'ansible_search_path' from source: unknown 8975 1727204039.86912: we have included files to process 8975 1727204039.86913: generating all_blocks data 8975 1727204039.86915: done generating all_blocks data 8975 1727204039.86915: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204039.86916: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204039.86918: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204039.87251: done processing included file 8975 1727204039.87253: iterating over new_blocks loaded from include file 8975 1727204039.87255: in VariableManager get_vars() 8975 1727204039.87280: done with get_vars() 8975 1727204039.87282: filtering new block on tags 8975 1727204039.87300: done filtering new block on tags 8975 1727204039.87303: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 8975 1727204039.87308: extending task lists for all hosts 
with included blocks 8975 1727204039.87427: done extending task lists 8975 1727204039.87429: done processing included files 8975 1727204039.87430: results queue empty 8975 1727204039.87430: checking for any_errors_fatal 8975 1727204039.87437: done checking for any_errors_fatal 8975 1727204039.87438: checking for max_fail_percentage 8975 1727204039.87439: done checking for max_fail_percentage 8975 1727204039.87440: checking to see if all hosts have failed and the running result is not ok 8975 1727204039.87441: done checking to see if all hosts have failed 8975 1727204039.87441: getting the remaining hosts for this loop 8975 1727204039.87443: done getting the remaining hosts for this loop 8975 1727204039.87445: getting the next task for host managed-node2 8975 1727204039.87450: done getting next task for host managed-node2 8975 1727204039.87452: ^ task is: TASK: Get stat for interface {{ interface }} 8975 1727204039.87454: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204039.87457: getting variables 8975 1727204039.87458: in VariableManager get_vars() 8975 1727204039.87474: Calling all_inventory to load vars for managed-node2 8975 1727204039.87477: Calling groups_inventory to load vars for managed-node2 8975 1727204039.87479: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204039.87486: Calling all_plugins_play to load vars for managed-node2 8975 1727204039.87488: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204039.87491: Calling groups_plugins_play to load vars for managed-node2 8975 1727204039.87644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204039.87848: done with get_vars() 8975 1727204039.87858: done getting variables 8975 1727204039.88072: variable 'interface' from source: task vars 8975 1727204039.88078: variable 'dhcp_interface1' from source: play vars 8975 1727204039.88190: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:53:59 -0400 (0:00:00.063) 0:00:11.199 ***** 8975 1727204039.88238: entering _queue_task() for managed-node2/stat 8975 1727204039.88893: worker is 1 (out of 1 available) 8975 1727204039.88902: exiting _queue_task() for managed-node2/stat 8975 1727204039.88914: done queuing things up, now waiting for results queue to drain 8975 1727204039.88915: waiting for pending results... 
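The "Get stat for interface test1" task that follows uses the stat action to verify that the interface created above exists on the managed node. The module arguments are not shown in this portion of the trace; a minimal shell equivalent, assuming the check targets the interface's entry under /sys/class/net (an assumption, not confirmed by this excerpt), would be:

# Hypothetical equivalent of the interface presence check;
# the /sys/class/net path is an assumption not shown in this trace.
iface=test1
if [ -e "/sys/class/net/${iface}" ]; then
    echo "interface ${iface} is present"
else
    echo "interface ${iface} is missing" >&2
    exit 1
fi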
8975 1727204039.88979: running TaskExecutor() for managed-node2/TASK: Get stat for interface test1 8975 1727204039.89138: in run() - task 127b8e07-fff9-9356-306d-000000000153 8975 1727204039.89161: variable 'ansible_search_path' from source: unknown 8975 1727204039.89172: variable 'ansible_search_path' from source: unknown 8975 1727204039.89220: calling self._execute() 8975 1727204039.89322: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204039.89338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204039.89354: variable 'omit' from source: magic vars 8975 1727204039.89838: variable 'ansible_distribution_major_version' from source: facts 8975 1727204039.89855: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204039.89870: variable 'omit' from source: magic vars 8975 1727204039.89970: variable 'omit' from source: magic vars 8975 1727204039.90060: variable 'interface' from source: task vars 8975 1727204039.90074: variable 'dhcp_interface1' from source: play vars 8975 1727204039.90152: variable 'dhcp_interface1' from source: play vars 8975 1727204039.90181: variable 'omit' from source: magic vars 8975 1727204039.90343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204039.90346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204039.90348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204039.90350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204039.90352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204039.90384: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204039.90391: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204039.90397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204039.90555: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204039.90574: Set connection var ansible_connection to ssh 8975 1727204039.90606: Set connection var ansible_shell_executable to /bin/sh 8975 1727204039.90616: Set connection var ansible_timeout to 10 8975 1727204039.90622: Set connection var ansible_shell_type to sh 8975 1727204039.90640: Set connection var ansible_pipelining to False 8975 1727204039.90916: variable 'ansible_shell_executable' from source: unknown 8975 1727204039.90920: variable 'ansible_connection' from source: unknown 8975 1727204039.90922: variable 'ansible_module_compression' from source: unknown 8975 1727204039.90928: variable 'ansible_shell_type' from source: unknown 8975 1727204039.90930: variable 'ansible_shell_executable' from source: unknown 8975 1727204039.90932: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204039.90934: variable 'ansible_pipelining' from source: unknown 8975 1727204039.90937: variable 'ansible_timeout' from source: unknown 8975 1727204039.90939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204039.91188: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204039.91207: variable 'omit' from source: magic vars 8975 1727204039.91222: starting attempt loop 8975 1727204039.91234: running the handler 8975 1727204039.91254: _low_level_execute_command(): starting 8975 1727204039.91268: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204039.92087: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204039.92145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204039.92234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204039.92255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204039.92433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204039.94090: stdout chunk (state=3): >>>/root <<< 8975 1727204039.94299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204039.94303: stdout chunk (state=3): >>><<< 8975 1727204039.94305: stderr chunk (state=3): >>><<< 8975 1727204039.94330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204039.94445: _low_level_execute_command(): starting 8975 1727204039.94449: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699 `" && echo ansible-tmp-1727204039.9433815-10439-89095477152699="` echo /root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699 `" ) && sleep 0' 8975 1727204039.95344: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204039.95586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204039.95972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204039.95976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204039.95981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204039.95984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204039.97796: stdout chunk (state=3): >>>ansible-tmp-1727204039.9433815-10439-89095477152699=/root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699 <<< 8975 1727204039.98076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204039.98081: stdout chunk (state=3): >>><<< 8975 1727204039.98083: stderr chunk (state=3): >>><<< 8975 1727204039.98086: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204039.9433815-10439-89095477152699=/root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204039.98088: variable 'ansible_module_compression' from source: unknown 8975 1727204039.98134: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8975 1727204039.98172: variable 'ansible_facts' from source: unknown 8975 1727204039.98259: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/AnsiballZ_stat.py 8975 1727204039.98510: Sending initial data 8975 1727204039.98513: Sent initial data (151 bytes) 8975 1727204039.99088: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204039.99099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204039.99110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204039.99226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204039.99277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204039.99387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.00970: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204040.01022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204040.01123: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpsh6b9hu_ /root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/AnsiballZ_stat.py <<< 8975 1727204040.01127: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/AnsiballZ_stat.py" <<< 8975 1727204040.01253: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpsh6b9hu_" to remote "/root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/AnsiballZ_stat.py" <<< 8975 1727204040.02623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204040.02655: stdout chunk (state=3): >>><<< 8975 1727204040.02659: stderr chunk (state=3): >>><<< 8975 1727204040.02712: done transferring module to remote 8975 1727204040.02763: _low_level_execute_command(): starting 8975 1727204040.02777: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/ /root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/AnsiballZ_stat.py && sleep 0' 8975 1727204040.03439: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204040.03458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204040.03481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204040.03602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204040.03618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204040.03636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.03740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.05916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204040.05925: stdout chunk (state=3): >>><<< 8975 1727204040.05928: stderr chunk (state=3): >>><<< 8975 1727204040.05949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204040.05959: _low_level_execute_command(): starting 8975 1727204040.06047: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/AnsiballZ_stat.py && sleep 0' 8975 1727204040.06796: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204040.06820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204040.06970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204040.06978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.07052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.28431: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34089, "dev": 23, "nlink": 1, "atime": 1727204038.454705, "mtime": 1727204038.454705, "ctime": 1727204038.454705, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8975 1727204040.29901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.47.73 closed. <<< 8975 1727204040.29905: stdout chunk (state=3): >>><<< 8975 1727204040.29977: stderr chunk (state=3): >>><<< 8975 1727204040.30181: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34089, "dev": 23, "nlink": 1, "atime": 1727204038.454705, "mtime": 1727204038.454705, "ctime": 1727204038.454705, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
8975 1727204040.30185: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204040.30187: _low_level_execute_command(): starting 8975 1727204040.30190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204039.9433815-10439-89095477152699/ > /dev/null 2>&1 && sleep 0' 8975 1727204040.31071: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204040.31092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204040.31140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.31215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.33323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204040.33327: stdout chunk (state=3): >>><<< 8975 1727204040.33334: stderr chunk (state=3): >>><<< 8975 1727204040.33358: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204040.33390: handler run complete 8975 1727204040.33472: attempt loop complete, returning result 8975 1727204040.33475: _execute() done 8975 1727204040.33478: dumping result to json 8975 1727204040.33480: done dumping result, returning 8975 1727204040.33482: done running TaskExecutor() for managed-node2/TASK: Get stat for interface test1 [127b8e07-fff9-9356-306d-000000000153] 8975 1727204040.33484: sending task result for task 127b8e07-fff9-9356-306d-000000000153 ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204038.454705, "block_size": 4096, "blocks": 0, "ctime": 1727204038.454705, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34089, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204038.454705, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8975 1727204040.34027: no more pending results, returning what we have 8975 1727204040.34030: results queue empty 8975 1727204040.34031: checking for any_errors_fatal 8975 1727204040.34033: done checking for any_errors_fatal 8975 1727204040.34035: checking for max_fail_percentage 8975 1727204040.34037: done checking for max_fail_percentage 8975 1727204040.34038: checking to see if all hosts have failed and the running result is not ok 8975 1727204040.34038: done checking to see if all hosts have failed 8975 1727204040.34041: getting the remaining hosts for this loop 8975 1727204040.34043: done getting the remaining hosts for this loop 8975 1727204040.34049: getting the next task for host managed-node2 8975 1727204040.34058: done getting next task for host managed-node2 8975 1727204040.34061: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 8975 1727204040.34064: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204040.34070: getting variables 8975 1727204040.34071: in VariableManager get_vars() 8975 1727204040.34111: Calling all_inventory to load vars for managed-node2 8975 1727204040.34114: Calling groups_inventory to load vars for managed-node2 8975 1727204040.34117: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204040.34130: Calling all_plugins_play to load vars for managed-node2 8975 1727204040.34134: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204040.34139: Calling groups_plugins_play to load vars for managed-node2 8975 1727204040.34333: done sending task result for task 127b8e07-fff9-9356-306d-000000000153 8975 1727204040.34340: WORKER PROCESS EXITING 8975 1727204040.34375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204040.34980: done with get_vars() 8975 1727204040.34993: done getting variables 8975 1727204040.35239: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 8975 1727204040.35603: variable 'interface' from source: task vars 8975 1727204040.35609: variable 'dhcp_interface1' from source: play vars 8975 1727204040.35790: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.477) 0:00:11.676 ***** 8975 1727204040.35952: entering _queue_task() for managed-node2/assert 8975 1727204040.35954: Creating lock for assert 8975 1727204040.36583: worker is 1 (out of 1 available) 8975 1727204040.36598: exiting _queue_task() for managed-node2/assert 8975 1727204040.36615: done queuing things up, now waiting for results queue to drain 8975 1727204040.36616: waiting for pending results... 
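(Similarly, tasks/assert_device_present.yml appears to consist of the include at line 3 and the assertion at line 5 of that file; the sketch below is a plausible reconstruction inferred from the task names and the conditional interface_stat.stat.exists evaluated in the next entries, not the verbatim file.)

# Hypothetical reconstruction of assert_device_present.yml.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists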
8975 1727204040.37188: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test1' 8975 1727204040.37522: in run() - task 127b8e07-fff9-9356-306d-000000000017 8975 1727204040.37530: variable 'ansible_search_path' from source: unknown 8975 1727204040.37533: variable 'ansible_search_path' from source: unknown 8975 1727204040.37660: calling self._execute() 8975 1727204040.37764: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.37779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.37820: variable 'omit' from source: magic vars 8975 1727204040.38513: variable 'ansible_distribution_major_version' from source: facts 8975 1727204040.38517: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204040.38520: variable 'omit' from source: magic vars 8975 1727204040.38547: variable 'omit' from source: magic vars 8975 1727204040.38698: variable 'interface' from source: task vars 8975 1727204040.38712: variable 'dhcp_interface1' from source: play vars 8975 1727204040.38876: variable 'dhcp_interface1' from source: play vars 8975 1727204040.39056: variable 'omit' from source: magic vars 8975 1727204040.39084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204040.39204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204040.39288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204040.39292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204040.39354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204040.39428: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204040.39439: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.39512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.39671: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204040.39675: Set connection var ansible_connection to ssh 8975 1727204040.39677: Set connection var ansible_shell_executable to /bin/sh 8975 1727204040.39680: Set connection var ansible_timeout to 10 8975 1727204040.39682: Set connection var ansible_shell_type to sh 8975 1727204040.39698: Set connection var ansible_pipelining to False 8975 1727204040.39776: variable 'ansible_shell_executable' from source: unknown 8975 1727204040.39780: variable 'ansible_connection' from source: unknown 8975 1727204040.39783: variable 'ansible_module_compression' from source: unknown 8975 1727204040.39785: variable 'ansible_shell_type' from source: unknown 8975 1727204040.39787: variable 'ansible_shell_executable' from source: unknown 8975 1727204040.39789: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.39791: variable 'ansible_pipelining' from source: unknown 8975 1727204040.39794: variable 'ansible_timeout' from source: unknown 8975 1727204040.39796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.40045: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204040.40076: variable 'omit' from source: magic vars 8975 1727204040.40080: starting attempt loop 8975 1727204040.40082: running the handler 8975 1727204040.40572: variable 'interface_stat' from source: set_fact 8975 1727204040.40602: Evaluated conditional (interface_stat.stat.exists): True 8975 1727204040.40635: handler run complete 8975 1727204040.40714: attempt loop complete, returning result 8975 1727204040.40717: _execute() done 8975 1727204040.40720: dumping result to json 8975 1727204040.40727: done dumping result, returning 8975 1727204040.40783: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test1' [127b8e07-fff9-9356-306d-000000000017] 8975 1727204040.40787: sending task result for task 127b8e07-fff9-9356-306d-000000000017 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204040.41490: no more pending results, returning what we have 8975 1727204040.41493: results queue empty 8975 1727204040.41494: checking for any_errors_fatal 8975 1727204040.41502: done checking for any_errors_fatal 8975 1727204040.41503: checking for max_fail_percentage 8975 1727204040.41505: done checking for max_fail_percentage 8975 1727204040.41506: checking to see if all hosts have failed and the running result is not ok 8975 1727204040.41507: done checking to see if all hosts have failed 8975 1727204040.41508: getting the remaining hosts for this loop 8975 1727204040.41510: done getting the remaining hosts for this loop 8975 1727204040.41513: getting the next task for host managed-node2 8975 1727204040.41522: done getting next task for host managed-node2 8975 1727204040.41525: ^ task is: TASK: Include the task 'get_interface_stat.yml' 8975 1727204040.41528: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204040.41532: getting variables 8975 1727204040.41533: in VariableManager get_vars() 8975 1727204040.41577: Calling all_inventory to load vars for managed-node2 8975 1727204040.41580: Calling groups_inventory to load vars for managed-node2 8975 1727204040.41583: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204040.41594: Calling all_plugins_play to load vars for managed-node2 8975 1727204040.41597: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204040.41600: Calling groups_plugins_play to load vars for managed-node2 8975 1727204040.42041: done sending task result for task 127b8e07-fff9-9356-306d-000000000017 8975 1727204040.42044: WORKER PROCESS EXITING 8975 1727204040.42148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204040.42523: done with get_vars() 8975 1727204040.42541: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.066) 0:00:11.743 ***** 8975 1727204040.42645: entering _queue_task() for managed-node2/include_tasks 8975 1727204040.42991: worker is 1 (out of 1 available) 8975 1727204040.43005: exiting _queue_task() for managed-node2/include_tasks 8975 1727204040.43019: done queuing things up, now waiting for results queue to drain 8975 1727204040.43020: waiting for pending results... 8975 1727204040.43752: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 8975 1727204040.44158: in run() - task 127b8e07-fff9-9356-306d-00000000001b 8975 1727204040.44187: variable 'ansible_search_path' from source: unknown 8975 1727204040.44202: variable 'ansible_search_path' from source: unknown 8975 1727204040.44250: calling self._execute() 8975 1727204040.44353: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.44369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.44384: variable 'omit' from source: magic vars 8975 1727204040.45109: variable 'ansible_distribution_major_version' from source: facts 8975 1727204040.45113: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204040.45116: _execute() done 8975 1727204040.45119: dumping result to json 8975 1727204040.45122: done dumping result, returning 8975 1727204040.45124: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-9356-306d-00000000001b] 8975 1727204040.45126: sending task result for task 127b8e07-fff9-9356-306d-00000000001b 8975 1727204040.45307: no more pending results, returning what we have 8975 1727204040.45312: in VariableManager get_vars() 8975 1727204040.45367: Calling all_inventory to load vars for managed-node2 8975 1727204040.45371: Calling groups_inventory to load vars for managed-node2 8975 1727204040.45373: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204040.45388: Calling all_plugins_play to load vars for managed-node2 8975 1727204040.45391: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204040.45393: Calling groups_plugins_play to load vars for managed-node2 8975 1727204040.45809: done sending task result for task 127b8e07-fff9-9356-306d-00000000001b 8975 1727204040.45813: WORKER PROCESS EXITING 8975 
1727204040.45841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204040.46126: done with get_vars() 8975 1727204040.46143: variable 'ansible_search_path' from source: unknown 8975 1727204040.46144: variable 'ansible_search_path' from source: unknown 8975 1727204040.46208: we have included files to process 8975 1727204040.46213: generating all_blocks data 8975 1727204040.46216: done generating all_blocks data 8975 1727204040.46226: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204040.46228: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204040.46233: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204040.46539: done processing included file 8975 1727204040.46541: iterating over new_blocks loaded from include file 8975 1727204040.46542: in VariableManager get_vars() 8975 1727204040.46573: done with get_vars() 8975 1727204040.46575: filtering new block on tags 8975 1727204040.46594: done filtering new block on tags 8975 1727204040.46596: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 8975 1727204040.46602: extending task lists for all hosts with included blocks 8975 1727204040.46774: done extending task lists 8975 1727204040.46775: done processing included files 8975 1727204040.46776: results queue empty 8975 1727204040.46780: checking for any_errors_fatal 8975 1727204040.46794: done checking for any_errors_fatal 8975 1727204040.46795: checking for max_fail_percentage 8975 1727204040.46797: done checking for max_fail_percentage 8975 1727204040.46798: checking to see if all hosts have failed and the running result is not ok 8975 1727204040.46798: done checking to see if all hosts have failed 8975 1727204040.46799: getting the remaining hosts for this loop 8975 1727204040.46801: done getting the remaining hosts for this loop 8975 1727204040.46806: getting the next task for host managed-node2 8975 1727204040.46814: done getting next task for host managed-node2 8975 1727204040.46817: ^ task is: TASK: Get stat for interface {{ interface }} 8975 1727204040.46820: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204040.46825: getting variables 8975 1727204040.46826: in VariableManager get_vars() 8975 1727204040.46846: Calling all_inventory to load vars for managed-node2 8975 1727204040.46849: Calling groups_inventory to load vars for managed-node2 8975 1727204040.46851: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204040.46857: Calling all_plugins_play to load vars for managed-node2 8975 1727204040.46860: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204040.46863: Calling groups_plugins_play to load vars for managed-node2 8975 1727204040.47112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204040.47341: done with get_vars() 8975 1727204040.47354: done getting variables 8975 1727204040.47542: variable 'interface' from source: task vars 8975 1727204040.47546: variable 'dhcp_interface2' from source: play vars 8975 1727204040.47633: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.050) 0:00:11.794 ***** 8975 1727204040.47771: entering _queue_task() for managed-node2/stat 8975 1727204040.48197: worker is 1 (out of 1 available) 8975 1727204040.48327: exiting _queue_task() for managed-node2/stat 8975 1727204040.48340: done queuing things up, now waiting for results queue to drain 8975 1727204040.48342: waiting for pending results... 8975 1727204040.48481: running TaskExecutor() for managed-node2/TASK: Get stat for interface test2 8975 1727204040.48630: in run() - task 127b8e07-fff9-9356-306d-00000000016b 8975 1727204040.48656: variable 'ansible_search_path' from source: unknown 8975 1727204040.48665: variable 'ansible_search_path' from source: unknown 8975 1727204040.48713: calling self._execute() 8975 1727204040.48942: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.48946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.48949: variable 'omit' from source: magic vars 8975 1727204040.49784: variable 'ansible_distribution_major_version' from source: facts 8975 1727204040.49822: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204040.49858: variable 'omit' from source: magic vars 8975 1727204040.49983: variable 'omit' from source: magic vars 8975 1727204040.50298: variable 'interface' from source: task vars 8975 1727204040.50334: variable 'dhcp_interface2' from source: play vars 8975 1727204040.50438: variable 'dhcp_interface2' from source: play vars 8975 1727204040.50470: variable 'omit' from source: magic vars 8975 1727204040.50532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204040.50582: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204040.50621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204040.50651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204040.50673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204040.50720: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204040.50731: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.50740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.50870: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204040.50891: Set connection var ansible_connection to ssh 8975 1727204040.50894: Set connection var ansible_shell_executable to /bin/sh 8975 1727204040.50920: Set connection var ansible_timeout to 10 8975 1727204040.50925: Set connection var ansible_shell_type to sh 8975 1727204040.50999: Set connection var ansible_pipelining to False 8975 1727204040.51003: variable 'ansible_shell_executable' from source: unknown 8975 1727204040.51005: variable 'ansible_connection' from source: unknown 8975 1727204040.51008: variable 'ansible_module_compression' from source: unknown 8975 1727204040.51010: variable 'ansible_shell_type' from source: unknown 8975 1727204040.51012: variable 'ansible_shell_executable' from source: unknown 8975 1727204040.51014: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.51016: variable 'ansible_pipelining' from source: unknown 8975 1727204040.51019: variable 'ansible_timeout' from source: unknown 8975 1727204040.51021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.51272: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204040.51290: variable 'omit' from source: magic vars 8975 1727204040.51301: starting attempt loop 8975 1727204040.51308: running the handler 8975 1727204040.51336: _low_level_execute_command(): starting 8975 1727204040.51349: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204040.52160: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204040.52252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204040.52308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204040.52342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.52468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.54391: stdout chunk (state=3): >>>/root <<< 8975 1727204040.54548: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 8975 1727204040.54684: stderr chunk (state=3): >>><<< 8975 1727204040.54695: stdout chunk (state=3): >>><<< 8975 1727204040.54729: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204040.54783: _low_level_execute_command(): starting 8975 1727204040.54972: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623 `" && echo ansible-tmp-1727204040.5476832-10471-193218539067623="` echo /root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623 `" ) && sleep 0' 8975 1727204040.56198: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204040.56439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204040.56448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.56531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.58536: stdout chunk (state=3): >>>ansible-tmp-1727204040.5476832-10471-193218539067623=/root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623 <<< 8975 1727204040.58784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204040.58789: stdout chunk (state=3): >>><<< 8975 1727204040.58828: stderr chunk (state=3): >>><<< 8975 
1727204040.58837: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204040.5476832-10471-193218539067623=/root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204040.58871: variable 'ansible_module_compression' from source: unknown 8975 1727204040.58934: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8975 1727204040.59041: variable 'ansible_facts' from source: unknown 8975 1727204040.59258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/AnsiballZ_stat.py 8975 1727204040.59702: Sending initial data 8975 1727204040.59706: Sent initial data (152 bytes) 8975 1727204040.60380: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204040.60452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204040.60496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.60593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.62235: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204040.62296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204040.62382: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpx4j7qk55 /root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/AnsiballZ_stat.py <<< 8975 1727204040.62386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/AnsiballZ_stat.py" <<< 8975 1727204040.62433: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpx4j7qk55" to remote "/root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/AnsiballZ_stat.py" <<< 8975 1727204040.64163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204040.64318: stderr chunk (state=3): >>><<< 8975 1727204040.64322: stdout chunk (state=3): >>><<< 8975 1727204040.64346: done transferring module to remote 8975 1727204040.64360: _low_level_execute_command(): starting 8975 1727204040.64369: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/ /root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/AnsiballZ_stat.py && sleep 0' 8975 1727204040.65030: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204040.65081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204040.65091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204040.65109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.65298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.67302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204040.67306: stdout chunk (state=3): >>><<< 8975 
1727204040.67506: stderr chunk (state=3): >>><<< 8975 1727204040.67510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204040.67517: _low_level_execute_command(): starting 8975 1727204040.67519: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/AnsiballZ_stat.py && sleep 0' 8975 1727204040.68776: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204040.68897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204040.68901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.69111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.85672: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34495, "dev": 23, "nlink": 1, "atime": 1727204038.4607983, "mtime": 1727204038.4607983, "ctime": 1727204038.4607983, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, 
"executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8975 1727204040.87074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204040.87155: stderr chunk (state=3): >>><<< 8975 1727204040.87468: stdout chunk (state=3): >>><<< 8975 1727204040.87474: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34495, "dev": 23, "nlink": 1, "atime": 1727204038.4607983, "mtime": 1727204038.4607983, "ctime": 1727204038.4607983, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
8975 1727204040.87476: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204040.87479: _low_level_execute_command(): starting 8975 1727204040.87481: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204040.5476832-10471-193218539067623/ > /dev/null 2>&1 && sleep 0' 8975 1727204040.88186: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204040.88250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204040.88281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204040.88299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204040.88403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204040.90777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204040.90782: stdout chunk (state=3): >>><<< 8975 1727204040.90785: stderr chunk (state=3): >>><<< 8975 1727204040.90788: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204040.90795: handler run complete 8975 1727204040.90819: attempt loop complete, returning result 8975 1727204040.90835: _execute() done 8975 1727204040.90844: dumping result to json 8975 1727204040.90885: done dumping result, returning 8975 1727204040.90910: done running TaskExecutor() for managed-node2/TASK: Get stat for interface test2 [127b8e07-fff9-9356-306d-00000000016b] 8975 1727204040.90921: sending task result for task 127b8e07-fff9-9356-306d-00000000016b ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204038.4607983, "block_size": 4096, "blocks": 0, "ctime": 1727204038.4607983, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34495, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204038.4607983, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8975 1727204040.91335: no more pending results, returning what we have 8975 1727204040.91339: results queue empty 8975 1727204040.91340: checking for any_errors_fatal 8975 1727204040.91342: done checking for any_errors_fatal 8975 1727204040.91343: checking for max_fail_percentage 8975 1727204040.91345: done checking for max_fail_percentage 8975 1727204040.91346: checking to see if all hosts have failed and the running result is not ok 8975 1727204040.91347: done checking to see if all hosts have failed 8975 1727204040.91348: getting the remaining hosts for this loop 8975 1727204040.91350: done getting the remaining hosts for this loop 8975 1727204040.91355: getting the next task for host managed-node2 8975 1727204040.91369: done getting next task for host managed-node2 8975 1727204040.91372: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 8975 1727204040.91375: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204040.91381: getting variables 8975 1727204040.91383: in VariableManager get_vars() 8975 1727204040.91593: Calling all_inventory to load vars for managed-node2 8975 1727204040.91597: Calling groups_inventory to load vars for managed-node2 8975 1727204040.91599: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204040.91613: Calling all_plugins_play to load vars for managed-node2 8975 1727204040.91617: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204040.91620: Calling groups_plugins_play to load vars for managed-node2 8975 1727204040.91935: done sending task result for task 127b8e07-fff9-9356-306d-00000000016b 8975 1727204040.91939: WORKER PROCESS EXITING 8975 1727204040.91964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204040.92462: done with get_vars() 8975 1727204040.92477: done getting variables 8975 1727204040.92545: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204040.92689: variable 'interface' from source: task vars 8975 1727204040.92693: variable 'dhcp_interface2' from source: play vars 8975 1727204040.92760: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.450) 0:00:12.244 ***** 8975 1727204040.92804: entering _queue_task() for managed-node2/assert 8975 1727204040.93403: worker is 1 (out of 1 available) 8975 1727204040.93413: exiting _queue_task() for managed-node2/assert 8975 1727204040.93426: done queuing things up, now waiting for results queue to drain 8975 1727204040.93427: waiting for pending results... 
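The log above shows the "Get stat for interface test2" task completing: the stat module was shipped to managed-node2 as AnsiballZ_stat.py over the multiplexed SSH connection, executed with /usr/bin/python3.12, and returned a symlink entry for /sys/class/net/test2 before the temporary directory was cleaned up. Based on the module arguments reported in the invocation (path, follow, and the disabled get_attributes/get_checksum/get_mime options), the underlying task probably looks roughly like the sketch below; note that this run reports interface_stat as coming from set_fact, so capturing the result with register is an assumption made here for brevity.

  - name: Get stat for interface {{ interface }}
    ansible.builtin.stat:
      path: "/sys/class/net/{{ interface }}"   # interface resolves to test2 via dhcp_interface2 in this run
      get_attributes: false
      get_checksum: false
      get_mime: false
      follow: false
    register: interface_stat   # assumption: the log shows interface_stat arriving via set_fact instead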
8975 1727204040.93497: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test2' 8975 1727204040.93657: in run() - task 127b8e07-fff9-9356-306d-00000000001c 8975 1727204040.93769: variable 'ansible_search_path' from source: unknown 8975 1727204040.93777: variable 'ansible_search_path' from source: unknown 8975 1727204040.93780: calling self._execute() 8975 1727204040.93896: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.93907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.93929: variable 'omit' from source: magic vars 8975 1727204040.94503: variable 'ansible_distribution_major_version' from source: facts 8975 1727204040.94536: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204040.94648: variable 'omit' from source: magic vars 8975 1727204040.94654: variable 'omit' from source: magic vars 8975 1727204040.94784: variable 'interface' from source: task vars 8975 1727204040.94795: variable 'dhcp_interface2' from source: play vars 8975 1727204040.94971: variable 'dhcp_interface2' from source: play vars 8975 1727204040.94975: variable 'omit' from source: magic vars 8975 1727204040.94985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204040.95051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204040.95097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204040.95138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204040.95160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204040.95225: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204040.95236: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.95245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.95421: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204040.95445: Set connection var ansible_connection to ssh 8975 1727204040.95461: Set connection var ansible_shell_executable to /bin/sh 8975 1727204040.95475: Set connection var ansible_timeout to 10 8975 1727204040.95483: Set connection var ansible_shell_type to sh 8975 1727204040.95521: Set connection var ansible_pipelining to False 8975 1727204040.95548: variable 'ansible_shell_executable' from source: unknown 8975 1727204040.95557: variable 'ansible_connection' from source: unknown 8975 1727204040.95634: variable 'ansible_module_compression' from source: unknown 8975 1727204040.95637: variable 'ansible_shell_type' from source: unknown 8975 1727204040.95639: variable 'ansible_shell_executable' from source: unknown 8975 1727204040.95644: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.95646: variable 'ansible_pipelining' from source: unknown 8975 1727204040.95657: variable 'ansible_timeout' from source: unknown 8975 1727204040.95669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.95881: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204040.95914: variable 'omit' from source: magic vars 8975 1727204040.95973: starting attempt loop 8975 1727204040.95977: running the handler 8975 1727204040.96193: variable 'interface_stat' from source: set_fact 8975 1727204040.96242: Evaluated conditional (interface_stat.stat.exists): True 8975 1727204040.96255: handler run complete 8975 1727204040.96285: attempt loop complete, returning result 8975 1727204040.96304: _execute() done 8975 1727204040.96332: dumping result to json 8975 1727204040.96336: done dumping result, returning 8975 1727204040.96418: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test2' [127b8e07-fff9-9356-306d-00000000001c] 8975 1727204040.96422: sending task result for task 127b8e07-fff9-9356-306d-00000000001c ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204040.96741: no more pending results, returning what we have 8975 1727204040.96744: results queue empty 8975 1727204040.96745: checking for any_errors_fatal 8975 1727204040.96762: done checking for any_errors_fatal 8975 1727204040.96769: checking for max_fail_percentage 8975 1727204040.96771: done checking for max_fail_percentage 8975 1727204040.96772: checking to see if all hosts have failed and the running result is not ok 8975 1727204040.96775: done checking to see if all hosts have failed 8975 1727204040.96776: getting the remaining hosts for this loop 8975 1727204040.96781: done getting the remaining hosts for this loop 8975 1727204040.96788: getting the next task for host managed-node2 8975 1727204040.96798: done getting next task for host managed-node2 8975 1727204040.96801: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 8975 1727204040.96803: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204040.96811: getting variables 8975 1727204040.96815: in VariableManager get_vars() 8975 1727204040.97185: Calling all_inventory to load vars for managed-node2 8975 1727204040.97188: Calling groups_inventory to load vars for managed-node2 8975 1727204040.97190: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204040.97202: Calling all_plugins_play to load vars for managed-node2 8975 1727204040.97204: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204040.97208: Calling groups_plugins_play to load vars for managed-node2 8975 1727204040.97609: done sending task result for task 127b8e07-fff9-9356-306d-00000000001c 8975 1727204040.97614: WORKER PROCESS EXITING 8975 1727204040.97659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204040.97908: done with get_vars() 8975 1727204040.97921: done getting variables 8975 1727204040.98004: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:28 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.052) 0:00:12.297 ***** 8975 1727204040.98051: entering _queue_task() for managed-node2/command 8975 1727204040.98427: worker is 1 (out of 1 available) 8975 1727204040.98554: exiting _queue_task() for managed-node2/command 8975 1727204040.98569: done queuing things up, now waiting for results queue to drain 8975 1727204040.98571: waiting for pending results... 
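The assertion itself is minimal: the only conditional the log evaluates is interface_stat.stat.exists, which is true because the sysfs path exists as a symlink to /sys/devices/virtual/net/test2. A sketch of the task at assert_device_present.yml:5 follows; the failure message is an assumption, since none is shown in the output.

  - name: Assert that the interface is present - '{{ interface }}'
    ansible.builtin.assert:
      that:
        - interface_stat.stat.exists
      fail_msg: "Interface {{ interface }} is missing"   # hypothetical; the log only shows the passing condition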
8975 1727204040.98904: running TaskExecutor() for managed-node2/TASK: Backup the /etc/resolv.conf for initscript 8975 1727204040.99046: in run() - task 127b8e07-fff9-9356-306d-00000000001d 8975 1727204040.99071: variable 'ansible_search_path' from source: unknown 8975 1727204040.99171: calling self._execute() 8975 1727204040.99247: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204040.99260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204040.99280: variable 'omit' from source: magic vars 8975 1727204040.99715: variable 'ansible_distribution_major_version' from source: facts 8975 1727204040.99736: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204040.99882: variable 'network_provider' from source: set_fact 8975 1727204040.99894: Evaluated conditional (network_provider == "initscripts"): False 8975 1727204040.99968: when evaluation is False, skipping this task 8975 1727204040.99973: _execute() done 8975 1727204040.99976: dumping result to json 8975 1727204040.99979: done dumping result, returning 8975 1727204040.99981: done running TaskExecutor() for managed-node2/TASK: Backup the /etc/resolv.conf for initscript [127b8e07-fff9-9356-306d-00000000001d] 8975 1727204040.99984: sending task result for task 127b8e07-fff9-9356-306d-00000000001d 8975 1727204041.00273: done sending task result for task 127b8e07-fff9-9356-306d-00000000001d 8975 1727204041.00276: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8975 1727204041.00328: no more pending results, returning what we have 8975 1727204041.00332: results queue empty 8975 1727204041.00333: checking for any_errors_fatal 8975 1727204041.00338: done checking for any_errors_fatal 8975 1727204041.00339: checking for max_fail_percentage 8975 1727204041.00341: done checking for max_fail_percentage 8975 1727204041.00342: checking to see if all hosts have failed and the running result is not ok 8975 1727204041.00343: done checking to see if all hosts have failed 8975 1727204041.00344: getting the remaining hosts for this loop 8975 1727204041.00345: done getting the remaining hosts for this loop 8975 1727204041.00349: getting the next task for host managed-node2 8975 1727204041.00355: done getting next task for host managed-node2 8975 1727204041.00358: ^ task is: TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 8975 1727204041.00361: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204041.00367: getting variables 8975 1727204041.00368: in VariableManager get_vars() 8975 1727204041.00413: Calling all_inventory to load vars for managed-node2 8975 1727204041.00416: Calling groups_inventory to load vars for managed-node2 8975 1727204041.00419: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204041.00433: Calling all_plugins_play to load vars for managed-node2 8975 1727204041.00437: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204041.00440: Calling groups_plugins_play to load vars for managed-node2 8975 1727204041.00814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204041.01039: done with get_vars() 8975 1727204041.01052: done getting variables 8975 1727204041.01122: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports using deprecated 'master' argument] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:33 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.031) 0:00:12.328 ***** 8975 1727204041.01156: entering _queue_task() for managed-node2/debug 8975 1727204041.01596: worker is 1 (out of 1 available) 8975 1727204041.01611: exiting _queue_task() for managed-node2/debug 8975 1727204041.01627: done queuing things up, now waiting for results queue to drain 8975 1727204041.01629: waiting for pending results... 8975 1727204041.01851: running TaskExecutor() for managed-node2/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 8975 1727204041.01957: in run() - task 127b8e07-fff9-9356-306d-00000000001e 8975 1727204041.01981: variable 'ansible_search_path' from source: unknown 8975 1727204041.02052: calling self._execute() 8975 1727204041.02210: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.02214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.02219: variable 'omit' from source: magic vars 8975 1727204041.02738: variable 'ansible_distribution_major_version' from source: facts 8975 1727204041.02769: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204041.02864: variable 'omit' from source: magic vars 8975 1727204041.02869: variable 'omit' from source: magic vars 8975 1727204041.02888: variable 'omit' from source: magic vars 8975 1727204041.02956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204041.03010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204041.03043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204041.03071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204041.03101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204041.03156: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204041.03172: variable 'ansible_host' from source: host vars for 
'managed-node2' 8975 1727204041.03234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.03343: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204041.03364: Set connection var ansible_connection to ssh 8975 1727204041.03384: Set connection var ansible_shell_executable to /bin/sh 8975 1727204041.03396: Set connection var ansible_timeout to 10 8975 1727204041.03404: Set connection var ansible_shell_type to sh 8975 1727204041.03426: Set connection var ansible_pipelining to False 8975 1727204041.03563: variable 'ansible_shell_executable' from source: unknown 8975 1727204041.03570: variable 'ansible_connection' from source: unknown 8975 1727204041.03577: variable 'ansible_module_compression' from source: unknown 8975 1727204041.03580: variable 'ansible_shell_type' from source: unknown 8975 1727204041.03583: variable 'ansible_shell_executable' from source: unknown 8975 1727204041.03585: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.03589: variable 'ansible_pipelining' from source: unknown 8975 1727204041.03592: variable 'ansible_timeout' from source: unknown 8975 1727204041.03595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.03820: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204041.03826: variable 'omit' from source: magic vars 8975 1727204041.03830: starting attempt loop 8975 1727204041.03832: running the handler 8975 1727204041.03834: handler run complete 8975 1727204041.03851: attempt loop complete, returning result 8975 1727204041.03859: _execute() done 8975 1727204041.03867: dumping result to json 8975 1727204041.03879: done dumping result, returning 8975 1727204041.03902: done running TaskExecutor() for managed-node2/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument [127b8e07-fff9-9356-306d-00000000001e] 8975 1727204041.03922: sending task result for task 127b8e07-fff9-9356-306d-00000000001e ok: [managed-node2] => {} MSG: ################################################## 8975 1727204041.04260: no more pending results, returning what we have 8975 1727204041.04263: results queue empty 8975 1727204041.04264: checking for any_errors_fatal 8975 1727204041.04273: done checking for any_errors_fatal 8975 1727204041.04274: checking for max_fail_percentage 8975 1727204041.04275: done checking for max_fail_percentage 8975 1727204041.04276: checking to see if all hosts have failed and the running result is not ok 8975 1727204041.04278: done checking to see if all hosts have failed 8975 1727204041.04278: getting the remaining hosts for this loop 8975 1727204041.04280: done getting the remaining hosts for this loop 8975 1727204041.04284: getting the next task for host managed-node2 8975 1727204041.04293: done getting next task for host managed-node2 8975 1727204041.04300: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8975 1727204041.04304: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204041.04331: getting variables 8975 1727204041.04334: in VariableManager get_vars() 8975 1727204041.04505: Calling all_inventory to load vars for managed-node2 8975 1727204041.04508: Calling groups_inventory to load vars for managed-node2 8975 1727204041.04514: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204041.04574: done sending task result for task 127b8e07-fff9-9356-306d-00000000001e 8975 1727204041.04579: WORKER PROCESS EXITING 8975 1727204041.04593: Calling all_plugins_play to load vars for managed-node2 8975 1727204041.04597: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204041.04601: Calling groups_plugins_play to load vars for managed-node2 8975 1727204041.04883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204041.05147: done with get_vars() 8975 1727204041.05160: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.041) 0:00:12.369 ***** 8975 1727204041.05287: entering _queue_task() for managed-node2/include_tasks 8975 1727204041.05650: worker is 1 (out of 1 available) 8975 1727204041.05663: exiting _queue_task() for managed-node2/include_tasks 8975 1727204041.05772: done queuing things up, now waiting for results queue to drain 8975 1727204041.05774: waiting for pending results... 
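Two quick tasks separate the interface checks from the role invocation. The resolv.conf backup is skipped because its guard evaluates to false (the skip result records false_condition: network_provider == "initscripts", and this run uses a different provider), and the debug task only prints a banner of '#' characters. The actual backup command never appears in the log, so the command line below is purely illustrative of the guard pattern:

  - name: Backup the /etc/resolv.conf for initscript
    ansible.builtin.command: cp -v /etc/resolv.conf /tmp/resolv.conf.bak   # hypothetical command; only the task name and guard appear in the log
    when: network_provider == "initscripts"

  - name: TEST Add Bond with 2 ports using deprecated 'master' argument
    ansible.builtin.debug:
      msg: "##################################################"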
8975 1727204041.05995: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8975 1727204041.06158: in run() - task 127b8e07-fff9-9356-306d-000000000026 8975 1727204041.06188: variable 'ansible_search_path' from source: unknown 8975 1727204041.06201: variable 'ansible_search_path' from source: unknown 8975 1727204041.06253: calling self._execute() 8975 1727204041.06359: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.06374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.06413: variable 'omit' from source: magic vars 8975 1727204041.06852: variable 'ansible_distribution_major_version' from source: facts 8975 1727204041.06875: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204041.06891: _execute() done 8975 1727204041.06961: dumping result to json 8975 1727204041.06965: done dumping result, returning 8975 1727204041.06969: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-9356-306d-000000000026] 8975 1727204041.06972: sending task result for task 127b8e07-fff9-9356-306d-000000000026 8975 1727204041.07206: no more pending results, returning what we have 8975 1727204041.07212: in VariableManager get_vars() 8975 1727204041.07273: Calling all_inventory to load vars for managed-node2 8975 1727204041.07277: Calling groups_inventory to load vars for managed-node2 8975 1727204041.07280: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204041.07294: Calling all_plugins_play to load vars for managed-node2 8975 1727204041.07298: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204041.07301: Calling groups_plugins_play to load vars for managed-node2 8975 1727204041.07796: done sending task result for task 127b8e07-fff9-9356-306d-000000000026 8975 1727204041.07799: WORKER PROCESS EXITING 8975 1727204041.07830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204041.08052: done with get_vars() 8975 1727204041.08060: variable 'ansible_search_path' from source: unknown 8975 1727204041.08061: variable 'ansible_search_path' from source: unknown 8975 1727204041.08101: we have included files to process 8975 1727204041.08102: generating all_blocks data 8975 1727204041.08103: done generating all_blocks data 8975 1727204041.08108: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8975 1727204041.08109: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8975 1727204041.08111: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8975 1727204041.08963: done processing included file 8975 1727204041.08968: iterating over new_blocks loaded from include file 8975 1727204041.08970: in VariableManager get_vars() 8975 1727204041.09019: done with get_vars() 8975 1727204041.09021: filtering new block on tags 8975 1727204041.09047: done filtering new block on tags 8975 1727204041.09051: in VariableManager get_vars() 8975 1727204041.09085: done with get_vars() 8975 1727204041.09091: filtering new block on tags 8975 1727204041.09121: done filtering new block on tags 8975 1727204041.09126: in VariableManager get_vars() 8975 1727204041.09158: done 
with get_vars() 8975 1727204041.09160: filtering new block on tags 8975 1727204041.09181: done filtering new block on tags 8975 1727204041.09184: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 8975 1727204041.09189: extending task lists for all hosts with included blocks 8975 1727204041.10739: done extending task lists 8975 1727204041.10741: done processing included files 8975 1727204041.10742: results queue empty 8975 1727204041.10743: checking for any_errors_fatal 8975 1727204041.10746: done checking for any_errors_fatal 8975 1727204041.10747: checking for max_fail_percentage 8975 1727204041.10748: done checking for max_fail_percentage 8975 1727204041.10749: checking to see if all hosts have failed and the running result is not ok 8975 1727204041.10780: done checking to see if all hosts have failed 8975 1727204041.10781: getting the remaining hosts for this loop 8975 1727204041.10783: done getting the remaining hosts for this loop 8975 1727204041.10786: getting the next task for host managed-node2 8975 1727204041.10792: done getting next task for host managed-node2 8975 1727204041.10795: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8975 1727204041.10798: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204041.10836: getting variables 8975 1727204041.10837: in VariableManager get_vars() 8975 1727204041.10908: Calling all_inventory to load vars for managed-node2 8975 1727204041.10937: Calling groups_inventory to load vars for managed-node2 8975 1727204041.10940: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204041.10947: Calling all_plugins_play to load vars for managed-node2 8975 1727204041.10950: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204041.10953: Calling groups_plugins_play to load vars for managed-node2 8975 1727204041.11252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204041.11754: done with get_vars() 8975 1727204041.11810: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.066) 0:00:12.436 ***** 8975 1727204041.11947: entering _queue_task() for managed-node2/setup 8975 1727204041.12398: worker is 1 (out of 1 available) 8975 1727204041.12413: exiting _queue_task() for managed-node2/setup 8975 1727204041.12427: done queuing things up, now waiting for results queue to drain 8975 1727204041.12428: waiting for pending results... 8975 1727204041.12683: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8975 1727204041.12886: in run() - task 127b8e07-fff9-9356-306d-000000000189 8975 1727204041.12915: variable 'ansible_search_path' from source: unknown 8975 1727204041.12930: variable 'ansible_search_path' from source: unknown 8975 1727204041.12989: calling self._execute() 8975 1727204041.13080: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.13095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.13110: variable 'omit' from source: magic vars 8975 1727204041.13662: variable 'ansible_distribution_major_version' from source: facts 8975 1727204041.13691: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204041.14076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204041.17333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204041.17426: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204041.17478: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204041.17534: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204041.17583: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204041.17661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204041.17773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204041.17777: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204041.17782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204041.17809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204041.17880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204041.17917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204041.17954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204041.18008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204041.18038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204041.18235: variable '__network_required_facts' from source: role '' defaults 8975 1727204041.18254: variable 'ansible_facts' from source: unknown 8975 1727204041.18363: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 8975 1727204041.18453: when evaluation is False, skipping this task 8975 1727204041.18456: _execute() done 8975 1727204041.18459: dumping result to json 8975 1727204041.18463: done dumping result, returning 8975 1727204041.18468: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-9356-306d-000000000189] 8975 1727204041.18472: sending task result for task 127b8e07-fff9-9356-306d-000000000189 8975 1727204041.18668: done sending task result for task 127b8e07-fff9-9356-306d-000000000189 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204041.18719: no more pending results, returning what we have 8975 1727204041.18722: results queue empty 8975 1727204041.18723: checking for any_errors_fatal 8975 1727204041.18724: done checking for any_errors_fatal 8975 1727204041.18725: checking for max_fail_percentage 8975 1727204041.18726: done checking for max_fail_percentage 8975 1727204041.18728: checking to see if all hosts have failed and the running result is not ok 8975 1727204041.18729: done checking to see if all hosts have failed 8975 1727204041.18729: getting the remaining hosts for this loop 8975 1727204041.18731: done getting the remaining hosts for this loop 8975 1727204041.18735: getting the next task for host managed-node2 8975 1727204041.18744: done getting next task 
for host managed-node2 8975 1727204041.18748: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 8975 1727204041.18752: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204041.18769: getting variables 8975 1727204041.18771: in VariableManager get_vars() 8975 1727204041.18815: Calling all_inventory to load vars for managed-node2 8975 1727204041.18818: Calling groups_inventory to load vars for managed-node2 8975 1727204041.18820: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204041.18830: Calling all_plugins_play to load vars for managed-node2 8975 1727204041.18833: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204041.18836: Calling groups_plugins_play to load vars for managed-node2 8975 1727204041.19051: WORKER PROCESS EXITING 8975 1727204041.19079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204041.19707: done with get_vars() 8975 1727204041.19721: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.083) 0:00:12.519 ***** 8975 1727204041.20257: entering _queue_task() for managed-node2/stat 8975 1727204041.21113: worker is 1 (out of 1 available) 8975 1727204041.21127: exiting _queue_task() for managed-node2/stat 8975 1727204041.21139: done queuing things up, now waiting for results queue to drain 8975 1727204041.21140: waiting for pending results... 
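At this point the role's entry tasks file (roles/network/tasks/main.yml:4) has pulled in set_facts.yml, and its first task is skipped because every fact the role needs is already cached: the evaluated guard is __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, and the result is censored because the task sets no_log. The queue entry is managed-node2/setup, so the module is setup; the gather_subset value below is an assumption, since the skipped task never reveals its arguments:

  - name: Ensure ansible_facts used by role
    ansible.builtin.include_tasks: set_facts.yml   # the log reports loading roles/network/tasks/set_facts.yml

  - name: Ensure ansible_facts used by role are present
    ansible.builtin.setup:
      gather_subset: min   # assumed; only the module name and the when expression appear in the log
    when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    no_log: true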
8975 1727204041.21463: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 8975 1727204041.21974: in run() - task 127b8e07-fff9-9356-306d-00000000018b 8975 1727204041.21979: variable 'ansible_search_path' from source: unknown 8975 1727204041.21982: variable 'ansible_search_path' from source: unknown 8975 1727204041.21986: calling self._execute() 8975 1727204041.22062: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.22070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.22083: variable 'omit' from source: magic vars 8975 1727204041.23092: variable 'ansible_distribution_major_version' from source: facts 8975 1727204041.23102: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204041.23497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204041.24183: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204041.24647: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204041.24651: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204041.25273: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204041.25277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204041.25280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204041.25282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204041.25284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204041.25665: variable '__network_is_ostree' from source: set_fact 8975 1727204041.25673: Evaluated conditional (not __network_is_ostree is defined): False 8975 1727204041.25676: when evaluation is False, skipping this task 8975 1727204041.25679: _execute() done 8975 1727204041.25681: dumping result to json 8975 1727204041.25730: done dumping result, returning 8975 1727204041.25734: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-9356-306d-00000000018b] 8975 1727204041.25736: sending task result for task 127b8e07-fff9-9356-306d-00000000018b 8975 1727204041.25811: done sending task result for task 127b8e07-fff9-9356-306d-00000000018b 8975 1727204041.25814: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8975 1727204041.25893: no more pending results, returning what we have 8975 1727204041.25897: results queue empty 8975 1727204041.25897: checking for any_errors_fatal 8975 1727204041.25906: done checking for any_errors_fatal 8975 1727204041.25907: checking for max_fail_percentage 8975 
1727204041.25908: done checking for max_fail_percentage 8975 1727204041.25910: checking to see if all hosts have failed and the running result is not ok 8975 1727204041.25911: done checking to see if all hosts have failed 8975 1727204041.25912: getting the remaining hosts for this loop 8975 1727204041.25914: done getting the remaining hosts for this loop 8975 1727204041.25919: getting the next task for host managed-node2 8975 1727204041.25930: done getting next task for host managed-node2 8975 1727204041.25934: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8975 1727204041.25938: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204041.25955: getting variables 8975 1727204041.25957: in VariableManager get_vars() 8975 1727204041.26009: Calling all_inventory to load vars for managed-node2 8975 1727204041.26012: Calling groups_inventory to load vars for managed-node2 8975 1727204041.26014: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204041.26028: Calling all_plugins_play to load vars for managed-node2 8975 1727204041.26032: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204041.26035: Calling groups_plugins_play to load vars for managed-node2 8975 1727204041.26416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204041.26704: done with get_vars() 8975 1727204041.26719: done getting variables 8975 1727204041.27001: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.067) 0:00:12.587 ***** 8975 1727204041.27050: entering _queue_task() for managed-node2/set_fact 8975 1727204041.27920: worker is 1 (out of 1 available) 8975 1727204041.27937: exiting _queue_task() for managed-node2/set_fact 8975 1727204041.27954: done queuing things up, now waiting for results queue to drain 8975 1727204041.27956: waiting for pending results... 
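The ostree detection follows the usual guard-the-expensive-check pattern: the stat task only runs when __network_is_ostree has not been set yet, and here the fact already exists, so the task is skipped. The path being stat'ed never appears in this log; /run/ostree-booted is the conventional marker file and is used below only as an assumption:

  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted   # assumed marker path; not visible in the skipped task's output
    register: __ostree_booted_stat   # hypothetical variable name
    when: not __network_is_ostree is defined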
8975 1727204041.28588: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8975 1727204041.29079: in run() - task 127b8e07-fff9-9356-306d-00000000018c 8975 1727204041.29095: variable 'ansible_search_path' from source: unknown 8975 1727204041.29098: variable 'ansible_search_path' from source: unknown 8975 1727204041.29272: calling self._execute() 8975 1727204041.29438: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.29442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.29454: variable 'omit' from source: magic vars 8975 1727204041.30429: variable 'ansible_distribution_major_version' from source: facts 8975 1727204041.30437: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204041.30828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204041.31485: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204041.31567: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204041.31573: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204041.31617: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204041.31722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204041.31755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204041.31779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204041.31906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204041.31916: variable '__network_is_ostree' from source: set_fact 8975 1727204041.31928: Evaluated conditional (not __network_is_ostree is defined): False 8975 1727204041.31932: when evaluation is False, skipping this task 8975 1727204041.31935: _execute() done 8975 1727204041.31938: dumping result to json 8975 1727204041.31940: done dumping result, returning 8975 1727204041.32005: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-9356-306d-00000000018c] 8975 1727204041.32010: sending task result for task 127b8e07-fff9-9356-306d-00000000018c 8975 1727204041.32081: done sending task result for task 127b8e07-fff9-9356-306d-00000000018c 8975 1727204041.32084: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8975 1727204041.32156: no more pending results, returning what we have 8975 1727204041.32160: results queue empty 8975 1727204041.32161: checking for any_errors_fatal 8975 1727204041.32167: done checking for any_errors_fatal 8975 1727204041.32168: checking for 
max_fail_percentage 8975 1727204041.32170: done checking for max_fail_percentage 8975 1727204041.32170: checking to see if all hosts have failed and the running result is not ok 8975 1727204041.32172: done checking to see if all hosts have failed 8975 1727204041.32172: getting the remaining hosts for this loop 8975 1727204041.32174: done getting the remaining hosts for this loop 8975 1727204041.32178: getting the next task for host managed-node2 8975 1727204041.32188: done getting next task for host managed-node2 8975 1727204041.32192: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 8975 1727204041.32195: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204041.32209: getting variables 8975 1727204041.32210: in VariableManager get_vars() 8975 1727204041.32251: Calling all_inventory to load vars for managed-node2 8975 1727204041.32255: Calling groups_inventory to load vars for managed-node2 8975 1727204041.32256: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204041.32470: Calling all_plugins_play to load vars for managed-node2 8975 1727204041.32475: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204041.32479: Calling groups_plugins_play to load vars for managed-node2 8975 1727204041.32921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204041.33326: done with get_vars() 8975 1727204041.33338: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.063) 0:00:12.651 ***** 8975 1727204041.33443: entering _queue_task() for managed-node2/service_facts 8975 1727204041.33445: Creating lock for service_facts 8975 1727204041.34405: worker is 1 (out of 1 available) 8975 1727204041.34418: exiting _queue_task() for managed-node2/service_facts 8975 1727204041.34435: done queuing things up, now waiting for results queue to drain 8975 1727204041.34436: waiting for pending results... 
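The next task, "Check which services are running" at set_facts.yml:21, drives the service_facts module, which the module invocation later in this log shows being called with empty module_args. In playbook form it amounts to the first task below; the second task is only an illustrative consumer (not part of the role), using the fact layout the module output further down reports, where units are keyed by name under ansible_facts.services.

# Sketch of the service scan (set_facts.yml:21): service_facts takes no
# arguments and publishes its results under ansible_facts.services.
- name: Check which services are running
  service_facts:

# Illustrative consumer only (not from the role). The module output below
# reports NetworkManager.service with state "running" on managed-node2.
- name: Show NetworkManager state from the gathered service facts
  debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
  when: "'NetworkManager.service' in ansible_facts.services"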
8975 1727204041.35091: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 8975 1727204041.35097: in run() - task 127b8e07-fff9-9356-306d-00000000018e 8975 1727204041.35099: variable 'ansible_search_path' from source: unknown 8975 1727204041.35102: variable 'ansible_search_path' from source: unknown 8975 1727204041.35105: calling self._execute() 8975 1727204041.35132: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.35137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.35148: variable 'omit' from source: magic vars 8975 1727204041.35599: variable 'ansible_distribution_major_version' from source: facts 8975 1727204041.35612: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204041.35621: variable 'omit' from source: magic vars 8975 1727204041.35773: variable 'omit' from source: magic vars 8975 1727204041.35776: variable 'omit' from source: magic vars 8975 1727204041.35834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204041.35874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204041.35905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204041.35927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204041.35937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204041.36171: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204041.36177: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.36180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.36183: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204041.36185: Set connection var ansible_connection to ssh 8975 1727204041.36188: Set connection var ansible_shell_executable to /bin/sh 8975 1727204041.36190: Set connection var ansible_timeout to 10 8975 1727204041.36193: Set connection var ansible_shell_type to sh 8975 1727204041.36195: Set connection var ansible_pipelining to False 8975 1727204041.36197: variable 'ansible_shell_executable' from source: unknown 8975 1727204041.36200: variable 'ansible_connection' from source: unknown 8975 1727204041.36203: variable 'ansible_module_compression' from source: unknown 8975 1727204041.36205: variable 'ansible_shell_type' from source: unknown 8975 1727204041.36207: variable 'ansible_shell_executable' from source: unknown 8975 1727204041.36209: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204041.36211: variable 'ansible_pipelining' from source: unknown 8975 1727204041.36214: variable 'ansible_timeout' from source: unknown 8975 1727204041.36216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204041.36445: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204041.36455: variable 'omit' from source: magic vars 8975 1727204041.36460: starting attempt loop 
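Before the handler runs, the executor resolves the per-host connection settings logged above: an ssh connection, a sh shell at /bin/sh, a 10 second timeout, pipelining off, and ZIP_DEFLATED module compression, with ansible_host and ansible_ssh_extra_args coming from host vars. Settings of this kind normally live in inventory or host/group vars; the snippet below is a hypothetical inventory fragment that would yield the same values, where the address and interpreter are the ones used later in this run and the remaining entries simply mirror the resolved defaults rather than anything known to be set explicitly.

# Hypothetical inventory fragment; variable names are standard Ansible
# connection variables, values mirror what the executor resolved above.
all:
  hosts:
    managed-node2:
      ansible_host: 10.31.47.73          # target address used by the ssh session below
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_timeout: 10
      ansible_pipelining: false
      ansible_python_interpreter: /usr/bin/python3.12   # interpreter invoked for AnsiballZ further down
      # ansible_ssh_extra_args is also set in host vars for this run; its value is not shown in the log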
8975 1727204041.36463: running the handler 8975 1727204041.36480: _low_level_execute_command(): starting 8975 1727204041.36488: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204041.37383: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204041.37387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204041.37390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204041.37392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204041.37400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204041.37402: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204041.37405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204041.37409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204041.37442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204041.37456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204041.37535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204041.37695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204041.39500: stdout chunk (state=3): >>>/root <<< 8975 1727204041.39956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204041.39960: stdout chunk (state=3): >>><<< 8975 1727204041.40011: stderr chunk (state=3): >>><<< 8975 1727204041.40016: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204041.40019: _low_level_execute_command(): starting 8975 
1727204041.40022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961 `" && echo ansible-tmp-1727204041.3999565-10509-41797392424961="` echo /root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961 `" ) && sleep 0' 8975 1727204041.42590: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204041.42763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204041.42830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204041.42850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204041.42951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204041.45005: stdout chunk (state=3): >>>ansible-tmp-1727204041.3999565-10509-41797392424961=/root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961 <<< 8975 1727204041.45114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204041.45417: stderr chunk (state=3): >>><<< 8975 1727204041.45449: stdout chunk (state=3): >>><<< 8975 1727204041.45470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204041.3999565-10509-41797392424961=/root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204041.45552: variable 
'ansible_module_compression' from source: unknown 8975 1727204041.45581: ANSIBALLZ: Using lock for service_facts 8975 1727204041.45584: ANSIBALLZ: Acquiring lock 8975 1727204041.45587: ANSIBALLZ: Lock acquired: 140501803048768 8975 1727204041.45592: ANSIBALLZ: Creating module 8975 1727204041.74139: ANSIBALLZ: Writing module into payload 8975 1727204041.74379: ANSIBALLZ: Writing module 8975 1727204041.74670: ANSIBALLZ: Renaming module 8975 1727204041.74677: ANSIBALLZ: Done creating module 8975 1727204041.74679: variable 'ansible_facts' from source: unknown 8975 1727204041.74718: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/AnsiballZ_service_facts.py 8975 1727204041.75236: Sending initial data 8975 1727204041.75240: Sent initial data (160 bytes) 8975 1727204041.76539: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204041.76606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204041.76749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204041.76791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204041.77009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204041.79026: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204041.79071: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpxv0f4qpa /root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/AnsiballZ_service_facts.py <<< 8975 1727204041.79075: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/AnsiballZ_service_facts.py" <<< 8975 1727204041.79136: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpxv0f4qpa" to remote "/root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/AnsiballZ_service_facts.py" <<< 8975 1727204041.80590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204041.80672: stderr chunk (state=3): >>><<< 8975 1727204041.80675: stdout chunk (state=3): >>><<< 8975 1727204041.80701: done transferring module to remote 8975 1727204041.80713: _low_level_execute_command(): starting 8975 1727204041.80718: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/ /root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/AnsiballZ_service_facts.py && sleep 0' 8975 1727204041.81658: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204041.81663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204041.81687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204041.81691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204041.81787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204041.81890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204041.81898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204041.83973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204041.83978: stderr chunk (state=3): >>><<< 8975 1727204041.83980: stdout chunk (state=3): >>><<< 8975 1727204041.83983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204041.83985: _low_level_execute_command(): starting 8975 1727204041.83987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/AnsiballZ_service_facts.py && sleep 0' 8975 1727204041.84629: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204041.84659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204041.84683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204041.84711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204041.84822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204041.84838: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204041.85114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204041.85219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204044.01528: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.<<< 8975 1727204044.01635: stdout chunk (state=3): >>>service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": 
"systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "deb<<< 8975 1727204044.01659: stdout chunk (state=3): >>>ug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": 
"unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 8975 1727204044.03224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204044.03241: stdout chunk (state=3): >>><<< 8975 1727204044.03287: stderr chunk (state=3): >>><<< 8975 1727204044.03315: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": 
"plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
8975 1727204044.08074: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204044.08079: _low_level_execute_command(): starting 8975 1727204044.08081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204041.3999565-10509-41797392424961/ > /dev/null 2>&1 && sleep 0' 8975 1727204044.09529: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204044.09536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204044.09744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204044.09891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204044.09986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204044.12055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204044.12095: stderr chunk (state=3): >>><<< 8975 1727204044.12098: stdout chunk (state=3): >>><<< 8975 1727204044.12180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204044.12186: handler run complete 8975 1727204044.12640: variable 'ansible_facts' from source: unknown 8975 1727204044.13094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204044.14390: variable 'ansible_facts' from source: unknown 8975 1727204044.14786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204044.15265: attempt loop complete, returning result 8975 1727204044.15418: _execute() done 8975 1727204044.15422: dumping result to json 8975 1727204044.15499: done dumping result, returning 8975 1727204044.15569: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-9356-306d-00000000018e] 8975 1727204044.15576: sending task result for task 127b8e07-fff9-9356-306d-00000000018e 8975 1727204044.17345: done sending task result for task 127b8e07-fff9-9356-306d-00000000018e 8975 1727204044.17349: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204044.17444: no more pending results, returning what we have 8975 1727204044.17447: results queue empty 8975 1727204044.17448: checking for any_errors_fatal 8975 1727204044.17452: done checking for any_errors_fatal 8975 1727204044.17453: checking for max_fail_percentage 8975 1727204044.17455: done checking for max_fail_percentage 8975 1727204044.17456: checking to see if all hosts have failed and the running result is not ok 8975 1727204044.17457: done checking to see if all hosts have failed 8975 1727204044.17458: getting the remaining hosts for this loop 8975 1727204044.17459: done getting the remaining hosts for this loop 8975 1727204044.17463: getting the next task for host managed-node2 8975 1727204044.17471: done getting next task for host managed-node2 8975 1727204044.17474: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 8975 1727204044.17478: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204044.17488: getting variables 8975 1727204044.17490: in VariableManager get_vars() 8975 1727204044.17608: Calling all_inventory to load vars for managed-node2 8975 1727204044.17612: Calling groups_inventory to load vars for managed-node2 8975 1727204044.17615: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204044.17625: Calling all_plugins_play to load vars for managed-node2 8975 1727204044.17630: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204044.17634: Calling groups_plugins_play to load vars for managed-node2 8975 1727204044.19014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204044.20280: done with get_vars() 8975 1727204044.20299: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:04 -0400 (0:00:02.870) 0:00:15.523 ***** 8975 1727204044.20651: entering _queue_task() for managed-node2/package_facts 8975 1727204044.20653: Creating lock for package_facts 8975 1727204044.21376: worker is 1 (out of 1 available) 8975 1727204044.21390: exiting _queue_task() for managed-node2/package_facts 8975 1727204044.21475: done queuing things up, now waiting for results queue to drain 8975 1727204044.21477: waiting for pending results... 8975 1727204044.21851: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 8975 1727204044.22373: in run() - task 127b8e07-fff9-9356-306d-00000000018f 8975 1727204044.22377: variable 'ansible_search_path' from source: unknown 8975 1727204044.22380: variable 'ansible_search_path' from source: unknown 8975 1727204044.22383: calling self._execute() 8975 1727204044.22773: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204044.22777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204044.22779: variable 'omit' from source: magic vars 8975 1727204044.23371: variable 'ansible_distribution_major_version' from source: facts 8975 1727204044.23391: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204044.23678: variable 'omit' from source: magic vars 8975 1727204044.23681: variable 'omit' from source: magic vars 8975 1727204044.23723: variable 'omit' from source: magic vars 8975 1727204044.23776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204044.24170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204044.24174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204044.24176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204044.24178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204044.24181: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204044.24183: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204044.24186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204044.24419: Set connection var ansible_module_compression 
to ZIP_DEFLATED 8975 1727204044.24427: Set connection var ansible_connection to ssh 8975 1727204044.24439: Set connection var ansible_shell_executable to /bin/sh 8975 1727204044.24449: Set connection var ansible_timeout to 10 8975 1727204044.24456: Set connection var ansible_shell_type to sh 8975 1727204044.24475: Set connection var ansible_pipelining to False 8975 1727204044.24506: variable 'ansible_shell_executable' from source: unknown 8975 1727204044.24513: variable 'ansible_connection' from source: unknown 8975 1727204044.24521: variable 'ansible_module_compression' from source: unknown 8975 1727204044.24527: variable 'ansible_shell_type' from source: unknown 8975 1727204044.24534: variable 'ansible_shell_executable' from source: unknown 8975 1727204044.24540: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204044.24548: variable 'ansible_pipelining' from source: unknown 8975 1727204044.24554: variable 'ansible_timeout' from source: unknown 8975 1727204044.24564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204044.24987: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204044.25086: variable 'omit' from source: magic vars 8975 1727204044.25096: starting attempt loop 8975 1727204044.25103: running the handler 8975 1727204044.25123: _low_level_execute_command(): starting 8975 1727204044.25135: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204044.26810: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204044.26816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204044.26820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204044.27087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204044.27156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204044.28864: stdout chunk (state=3): >>>/root <<< 8975 1727204044.29052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204044.29260: stderr chunk (state=3): >>><<< 8975 1727204044.29273: stdout chunk (state=3): >>><<< 8975 1727204044.29303: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204044.29574: _low_level_execute_command(): starting 8975 1727204044.29579: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200 `" && echo ansible-tmp-1727204044.2947273-10737-247250191066200="` echo /root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200 `" ) && sleep 0' 8975 1727204044.30824: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204044.30842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204044.30988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204044.31083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204044.31178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204044.33158: stdout chunk (state=3): >>>ansible-tmp-1727204044.2947273-10737-247250191066200=/root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200 <<< 8975 1727204044.33287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204044.33354: stderr chunk (state=3): >>><<< 8975 1727204044.33363: stdout chunk (state=3): >>><<< 8975 1727204044.33389: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204044.2947273-10737-247250191066200=/root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204044.33772: variable 'ansible_module_compression' from source: unknown 8975 1727204044.33776: ANSIBALLZ: Using lock for package_facts 8975 1727204044.33779: ANSIBALLZ: Acquiring lock 8975 1727204044.33781: ANSIBALLZ: Lock acquired: 140501806229328 8975 1727204044.33783: ANSIBALLZ: Creating module 8975 1727204044.98695: ANSIBALLZ: Writing module into payload 8975 1727204044.98862: ANSIBALLZ: Writing module 8975 1727204044.98904: ANSIBALLZ: Renaming module 8975 1727204044.98911: ANSIBALLZ: Done creating module 8975 1727204044.98940: variable 'ansible_facts' from source: unknown 8975 1727204044.99123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/AnsiballZ_package_facts.py 8975 1727204044.99324: Sending initial data 8975 1727204044.99328: Sent initial data (161 bytes) 8975 1727204045.00056: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204045.00176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204045.00200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204045.00316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204045.02077: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204045.02197: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204045.02255: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmphxu9opxy /root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/AnsiballZ_package_facts.py <<< 8975 1727204045.02259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/AnsiballZ_package_facts.py" <<< 8975 1727204045.02473: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmphxu9opxy" to remote "/root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/AnsiballZ_package_facts.py" <<< 8975 1727204045.05470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204045.05614: stderr chunk (state=3): >>><<< 8975 1727204045.05618: stdout chunk (state=3): >>><<< 8975 1727204045.05645: done transferring module to remote 8975 1727204045.05657: _low_level_execute_command(): starting 8975 1727204045.05662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/ /root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/AnsiballZ_package_facts.py && sleep 0' 8975 1727204045.07188: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204045.07199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204045.07374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204045.07440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204045.07658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204045.09528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 
1727204045.09764: stderr chunk (state=3): >>><<< 8975 1727204045.09770: stdout chunk (state=3): >>><<< 8975 1727204045.09773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204045.09775: _low_level_execute_command(): starting 8975 1727204045.09778: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/AnsiballZ_package_facts.py && sleep 0' 8975 1727204045.10357: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204045.10375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204045.10390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204045.10410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204045.10491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204045.10549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204045.10570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204045.10681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204045.73225: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": 
[{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": 
"intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 8975 1727204045.73283: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": 
"libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 8975 1727204045.73332: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1"<<< 8975 1727204045.73349: stdout chunk (state=3): >>>, "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": <<< 8975 1727204045.73387: stdout chunk (state=3): >>>"x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", 
"release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 8975 1727204045.73404: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50<<< 8975 1727204045.73435: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 8975 1727204045.73452: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 8975 1727204045.75391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204045.75462: stderr chunk (state=3): >>><<< 8975 1727204045.75467: stdout chunk (state=3): >>><<< 8975 1727204045.75505: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204045.77705: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204045.77723: _low_level_execute_command(): starting 8975 1727204045.77729: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204044.2947273-10737-247250191066200/ > /dev/null 2>&1 && sleep 0' 8975 1727204045.78273: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204045.78277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204045.78280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204045.78282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204045.78284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204045.78334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204045.78342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204045.78346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204045.78412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204045.80511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204045.80516: stdout chunk (state=3): >>><<< 8975 1727204045.80518: stderr chunk (state=3): >>><<< 8975 1727204045.80521: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204045.80523: handler run complete 8975 1727204045.81353: variable 'ansible_facts' from source: unknown 8975 1727204045.81767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204045.83705: variable 'ansible_facts' from source: unknown 8975 1727204045.84373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204045.85235: attempt loop complete, returning result 8975 1727204045.85253: _execute() done 8975 1727204045.85257: dumping result to json 8975 1727204045.85534: done dumping result, returning 8975 1727204045.85538: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-9356-306d-00000000018f] 8975 1727204045.85541: sending task result for task 127b8e07-fff9-9356-306d-00000000018f 8975 1727204045.87658: done sending task result for task 127b8e07-fff9-9356-306d-00000000018f 8975 1727204045.87660: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204045.87761: no more pending results, returning what we have 8975 1727204045.87763: results queue empty 8975 1727204045.87764: checking for any_errors_fatal 8975 1727204045.87770: done checking for any_errors_fatal 8975 1727204045.87770: checking for max_fail_percentage 8975 1727204045.87771: done checking for max_fail_percentage 8975 1727204045.87772: checking to see if all hosts have failed and the running result is not ok 8975 1727204045.87773: done checking to see if all hosts have failed 8975 1727204045.87773: getting the remaining hosts for this loop 8975 1727204045.87774: done getting the remaining hosts for this loop 8975 1727204045.87777: getting the next task for host managed-node2 8975 1727204045.87782: done getting next task for host managed-node2 8975 1727204045.87785: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 8975 1727204045.87787: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204045.87794: getting variables 8975 1727204045.87795: in VariableManager get_vars() 8975 1727204045.87829: Calling all_inventory to load vars for managed-node2 8975 1727204045.87832: Calling groups_inventory to load vars for managed-node2 8975 1727204045.87835: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204045.87846: Calling all_plugins_play to load vars for managed-node2 8975 1727204045.87848: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204045.87852: Calling groups_plugins_play to load vars for managed-node2 8975 1727204045.89013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204045.91210: done with get_vars() 8975 1727204045.91247: done getting variables 8975 1727204045.91318: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:05 -0400 (0:00:01.707) 0:00:17.230 ***** 8975 1727204045.91369: entering _queue_task() for managed-node2/debug 8975 1727204045.91732: worker is 1 (out of 1 available) 8975 1727204045.91743: exiting _queue_task() for managed-node2/debug 8975 1727204045.91757: done queuing things up, now waiting for results queue to drain 8975 1727204045.91759: waiting for pending results... 
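Aside on the trace above: the "Check which packages are installed" step runs the package_facts module over the existing SSH ControlMaster connection, removes its remote temp directory, and reports the result as "censored" because, as the result text states, 'no_log: true' was set for it. A minimal sketch of a task that would produce the same censored "ok" result (illustrative only, not a copy of the role's task file):

  - name: Check which packages are installed
    ansible.builtin.package_facts:
      manager: auto
    no_log: true

The gathered package list still lands in ansible_facts.packages on the controller; no_log only suppresses it from the displayed result.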
8975 1727204045.92188: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 8975 1727204045.92207: in run() - task 127b8e07-fff9-9356-306d-000000000027 8975 1727204045.92232: variable 'ansible_search_path' from source: unknown 8975 1727204045.92241: variable 'ansible_search_path' from source: unknown 8975 1727204045.92291: calling self._execute() 8975 1727204045.92393: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204045.92410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204045.92432: variable 'omit' from source: magic vars 8975 1727204045.92854: variable 'ansible_distribution_major_version' from source: facts 8975 1727204045.92876: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204045.92888: variable 'omit' from source: magic vars 8975 1727204045.93069: variable 'omit' from source: magic vars 8975 1727204045.93077: variable 'network_provider' from source: set_fact 8975 1727204045.93104: variable 'omit' from source: magic vars 8975 1727204045.93153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204045.93201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204045.93228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204045.93251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204045.93269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204045.93310: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204045.93320: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204045.93327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204045.93443: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204045.93451: Set connection var ansible_connection to ssh 8975 1727204045.93460: Set connection var ansible_shell_executable to /bin/sh 8975 1727204045.93473: Set connection var ansible_timeout to 10 8975 1727204045.93481: Set connection var ansible_shell_type to sh 8975 1727204045.93504: Set connection var ansible_pipelining to False 8975 1727204045.93533: variable 'ansible_shell_executable' from source: unknown 8975 1727204045.93541: variable 'ansible_connection' from source: unknown 8975 1727204045.93547: variable 'ansible_module_compression' from source: unknown 8975 1727204045.93553: variable 'ansible_shell_type' from source: unknown 8975 1727204045.93559: variable 'ansible_shell_executable' from source: unknown 8975 1727204045.93565: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204045.93574: variable 'ansible_pipelining' from source: unknown 8975 1727204045.93581: variable 'ansible_timeout' from source: unknown 8975 1727204045.93608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204045.93777: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 
1727204045.93825: variable 'omit' from source: magic vars 8975 1727204045.93829: starting attempt loop 8975 1727204045.93831: running the handler 8975 1727204045.93875: handler run complete 8975 1727204045.93896: attempt loop complete, returning result 8975 1727204045.93904: _execute() done 8975 1727204045.93933: dumping result to json 8975 1727204045.93937: done dumping result, returning 8975 1727204045.93939: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-9356-306d-000000000027] 8975 1727204045.93948: sending task result for task 127b8e07-fff9-9356-306d-000000000027 8975 1727204045.94115: done sending task result for task 127b8e07-fff9-9356-306d-000000000027 8975 1727204045.94119: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 8975 1727204045.94211: no more pending results, returning what we have 8975 1727204045.94215: results queue empty 8975 1727204045.94216: checking for any_errors_fatal 8975 1727204045.94227: done checking for any_errors_fatal 8975 1727204045.94228: checking for max_fail_percentage 8975 1727204045.94229: done checking for max_fail_percentage 8975 1727204045.94230: checking to see if all hosts have failed and the running result is not ok 8975 1727204045.94232: done checking to see if all hosts have failed 8975 1727204045.94233: getting the remaining hosts for this loop 8975 1727204045.94235: done getting the remaining hosts for this loop 8975 1727204045.94240: getting the next task for host managed-node2 8975 1727204045.94251: done getting next task for host managed-node2 8975 1727204045.94257: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8975 1727204045.94260: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204045.94275: getting variables 8975 1727204045.94277: in VariableManager get_vars() 8975 1727204045.94327: Calling all_inventory to load vars for managed-node2 8975 1727204045.94330: Calling groups_inventory to load vars for managed-node2 8975 1727204045.94332: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204045.94345: Calling all_plugins_play to load vars for managed-node2 8975 1727204045.94348: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204045.94351: Calling groups_plugins_play to load vars for managed-node2 8975 1727204045.96283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204045.98474: done with get_vars() 8975 1727204045.98520: done getting variables 8975 1727204045.98623: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:05 -0400 (0:00:00.072) 0:00:17.303 ***** 8975 1727204045.98661: entering _queue_task() for managed-node2/fail 8975 1727204045.98663: Creating lock for fail 8975 1727204045.99045: worker is 1 (out of 1 available) 8975 1727204045.99060: exiting _queue_task() for managed-node2/fail 8975 1727204045.99275: done queuing things up, now waiting for results queue to drain 8975 1727204045.99278: waiting for pending results... 
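For orientation, the "Print network provider" step just completed is a plain debug action gated on the distribution check, reporting the provider chosen earlier via set_fact. A sketch of its general shape (not a verbatim copy of roles/network/tasks/main.yml:7):

  - name: Print network provider
    ansible.builtin.debug:
      msg: "Using network provider: {{ network_provider }}"
    when: ansible_distribution_major_version != '6'

With network_provider set to "nm" on this host, that yields the "Using network provider: nm" message shown in the result above.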
8975 1727204045.99410: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8975 1727204045.99614: in run() - task 127b8e07-fff9-9356-306d-000000000028 8975 1727204045.99618: variable 'ansible_search_path' from source: unknown 8975 1727204045.99621: variable 'ansible_search_path' from source: unknown 8975 1727204045.99624: calling self._execute() 8975 1727204045.99731: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204045.99745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204045.99760: variable 'omit' from source: magic vars 8975 1727204046.00206: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.00224: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204046.00378: variable 'network_state' from source: role '' defaults 8975 1727204046.00391: Evaluated conditional (network_state != {}): False 8975 1727204046.00495: when evaluation is False, skipping this task 8975 1727204046.00498: _execute() done 8975 1727204046.00500: dumping result to json 8975 1727204046.00503: done dumping result, returning 8975 1727204046.00506: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-9356-306d-000000000028] 8975 1727204046.00508: sending task result for task 127b8e07-fff9-9356-306d-000000000028 8975 1727204046.00593: done sending task result for task 127b8e07-fff9-9356-306d-000000000028 8975 1727204046.00667: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204046.00723: no more pending results, returning what we have 8975 1727204046.00729: results queue empty 8975 1727204046.00731: checking for any_errors_fatal 8975 1727204046.00737: done checking for any_errors_fatal 8975 1727204046.00738: checking for max_fail_percentage 8975 1727204046.00740: done checking for max_fail_percentage 8975 1727204046.00741: checking to see if all hosts have failed and the running result is not ok 8975 1727204046.00742: done checking to see if all hosts have failed 8975 1727204046.00743: getting the remaining hosts for this loop 8975 1727204046.00746: done getting the remaining hosts for this loop 8975 1727204046.00750: getting the next task for host managed-node2 8975 1727204046.00759: done getting next task for host managed-node2 8975 1727204046.00763: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8975 1727204046.00769: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204046.00787: getting variables 8975 1727204046.00789: in VariableManager get_vars() 8975 1727204046.00842: Calling all_inventory to load vars for managed-node2 8975 1727204046.00845: Calling groups_inventory to load vars for managed-node2 8975 1727204046.00846: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204046.00861: Calling all_plugins_play to load vars for managed-node2 8975 1727204046.00864: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204046.01067: Calling groups_plugins_play to load vars for managed-node2 8975 1727204046.02909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204046.05038: done with get_vars() 8975 1727204046.05079: done getting variables 8975 1727204046.05145: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.065) 0:00:17.368 ***** 8975 1727204046.05183: entering _queue_task() for managed-node2/fail 8975 1727204046.05551: worker is 1 (out of 1 available) 8975 1727204046.05567: exiting _queue_task() for managed-node2/fail 8975 1727204046.05580: done queuing things up, now waiting for results queue to drain 8975 1727204046.05582: waiting for pending results... 
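The skip above is the expected path: network_state comes from the role defaults as an empty dict, so the guard "network_state != {}" evaluates False and the fail action never runs. Schematically, using only the conditions the trace actually evaluates (the message text is illustrative):

  - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
    ansible.builtin.fail:
      msg: Applying network_state is not supported with the initscripts provider
    when:
      - ansible_distribution_major_version != '6'
      - network_state != {}

The real task presumably also checks the selected provider, but conditional evaluation short-circuits on the first False, so only these two checks appear in the trace.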
8975 1727204046.05991: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8975 1727204046.06028: in run() - task 127b8e07-fff9-9356-306d-000000000029 8975 1727204046.06050: variable 'ansible_search_path' from source: unknown 8975 1727204046.06059: variable 'ansible_search_path' from source: unknown 8975 1727204046.06111: calling self._execute() 8975 1727204046.06210: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204046.06225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204046.06239: variable 'omit' from source: magic vars 8975 1727204046.06652: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.06671: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204046.06806: variable 'network_state' from source: role '' defaults 8975 1727204046.06825: Evaluated conditional (network_state != {}): False 8975 1727204046.06832: when evaluation is False, skipping this task 8975 1727204046.06843: _execute() done 8975 1727204046.06851: dumping result to json 8975 1727204046.06858: done dumping result, returning 8975 1727204046.06873: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-9356-306d-000000000029] 8975 1727204046.06886: sending task result for task 127b8e07-fff9-9356-306d-000000000029 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204046.07119: no more pending results, returning what we have 8975 1727204046.07123: results queue empty 8975 1727204046.07124: checking for any_errors_fatal 8975 1727204046.07134: done checking for any_errors_fatal 8975 1727204046.07135: checking for max_fail_percentage 8975 1727204046.07137: done checking for max_fail_percentage 8975 1727204046.07138: checking to see if all hosts have failed and the running result is not ok 8975 1727204046.07140: done checking to see if all hosts have failed 8975 1727204046.07140: getting the remaining hosts for this loop 8975 1727204046.07143: done getting the remaining hosts for this loop 8975 1727204046.07147: getting the next task for host managed-node2 8975 1727204046.07156: done getting next task for host managed-node2 8975 1727204046.07161: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8975 1727204046.07164: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204046.07185: getting variables 8975 1727204046.07186: in VariableManager get_vars() 8975 1727204046.07238: Calling all_inventory to load vars for managed-node2 8975 1727204046.07241: Calling groups_inventory to load vars for managed-node2 8975 1727204046.07244: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204046.07259: Calling all_plugins_play to load vars for managed-node2 8975 1727204046.07263: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204046.07472: Calling groups_plugins_play to load vars for managed-node2 8975 1727204046.08089: done sending task result for task 127b8e07-fff9-9356-306d-000000000029 8975 1727204046.08094: WORKER PROCESS EXITING 8975 1727204046.09482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204046.11719: done with get_vars() 8975 1727204046.11749: done getting variables 8975 1727204046.11801: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.066) 0:00:17.435 ***** 8975 1727204046.11838: entering _queue_task() for managed-node2/fail 8975 1727204046.12114: worker is 1 (out of 1 available) 8975 1727204046.12130: exiting _queue_task() for managed-node2/fail 8975 1727204046.12143: done queuing things up, now waiting for results queue to drain 8975 1727204046.12144: waiting for pending results... 
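Both network_state guards so far have been skipped on the same false_condition: network_state comes "from source: role '' defaults" and network_state != {} evaluates False, which implies the role default is an empty mapping. A plausible defaults entry consistent with that behaviour, as an illustration rather than the role's actual defaults file, would be:

# roles/network/defaults/main.yml -- hypothetical excerpt, inferred from the
# skip behaviour above rather than copied from the role.
network_state: {}   # empty mapping by default, so "network_state != {}" is False
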
8975 1727204046.12336: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8975 1727204046.12439: in run() - task 127b8e07-fff9-9356-306d-00000000002a 8975 1727204046.12450: variable 'ansible_search_path' from source: unknown 8975 1727204046.12454: variable 'ansible_search_path' from source: unknown 8975 1727204046.12492: calling self._execute() 8975 1727204046.12562: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204046.12568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204046.12577: variable 'omit' from source: magic vars 8975 1727204046.12909: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.12920: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204046.13066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204046.15676: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204046.15681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204046.15684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204046.15687: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204046.15689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204046.15784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.15812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.15846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.15889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.15905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.15996: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.16011: Evaluated conditional (ansible_distribution_major_version | int > 9): True 8975 1727204046.16114: variable 'ansible_distribution' from source: facts 8975 1727204046.16117: variable '__network_rh_distros' from source: role '' defaults 8975 1727204046.16125: Evaluated conditional (ansible_distribution in __network_rh_distros): False 8975 1727204046.16135: when evaluation is False, skipping this task 8975 1727204046.16140: _execute() done 8975 1727204046.16143: dumping result to json 8975 1727204046.16146: done dumping result, returning 8975 1727204046.16149: done running TaskExecutor() for managed-node2/TASK: 
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-9356-306d-00000000002a] 8975 1727204046.16156: sending task result for task 127b8e07-fff9-9356-306d-00000000002a 8975 1727204046.16256: done sending task result for task 127b8e07-fff9-9356-306d-00000000002a 8975 1727204046.16263: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 8975 1727204046.16311: no more pending results, returning what we have 8975 1727204046.16315: results queue empty 8975 1727204046.16316: checking for any_errors_fatal 8975 1727204046.16321: done checking for any_errors_fatal 8975 1727204046.16322: checking for max_fail_percentage 8975 1727204046.16324: done checking for max_fail_percentage 8975 1727204046.16325: checking to see if all hosts have failed and the running result is not ok 8975 1727204046.16326: done checking to see if all hosts have failed 8975 1727204046.16326: getting the remaining hosts for this loop 8975 1727204046.16328: done getting the remaining hosts for this loop 8975 1727204046.16332: getting the next task for host managed-node2 8975 1727204046.16340: done getting next task for host managed-node2 8975 1727204046.16344: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 8975 1727204046.16353: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204046.16375: getting variables 8975 1727204046.16377: in VariableManager get_vars() 8975 1727204046.16420: Calling all_inventory to load vars for managed-node2 8975 1727204046.16422: Calling groups_inventory to load vars for managed-node2 8975 1727204046.16424: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204046.16433: Calling all_plugins_play to load vars for managed-node2 8975 1727204046.16436: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204046.16439: Calling groups_plugins_play to load vars for managed-node2 8975 1727204046.17451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204046.19660: done with get_vars() 8975 1727204046.19707: done getting variables 8975 1727204046.19839: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.080) 0:00:17.515 ***** 8975 1727204046.19881: entering _queue_task() for managed-node2/dnf 8975 1727204046.20175: worker is 1 (out of 1 available) 8975 1727204046.20191: exiting _queue_task() for managed-node2/dnf 8975 1727204046.20203: done queuing things up, now waiting for results queue to drain 8975 1727204046.20204: waiting for pending results... 
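The teaming guard above was evaluated in stages: the version tests passed (ansible_distribution_major_version != '6' and | int > 9 both True), but ansible_distribution in __network_rh_distros came back False, so the fail never fired. A sketch of a task shaped like the one the log describes; the message text and any additional conditions the real task might carry are assumptions.

# Hypothetical sketch shaped after the conditions this log shows being evaluated.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later (assumed wording)
  when:
    - "ansible_distribution_major_version | int > 9"   # True in this run
    - "ansible_distribution in __network_rh_distros"   # False here, so the fail never fires
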
8975 1727204046.20390: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 8975 1727204046.20482: in run() - task 127b8e07-fff9-9356-306d-00000000002b 8975 1727204046.20496: variable 'ansible_search_path' from source: unknown 8975 1727204046.20500: variable 'ansible_search_path' from source: unknown 8975 1727204046.20534: calling self._execute() 8975 1727204046.20607: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204046.20611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204046.20620: variable 'omit' from source: magic vars 8975 1727204046.20923: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.20934: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204046.21090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204046.24103: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204046.24160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204046.24213: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204046.24372: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204046.24375: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204046.24410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.24451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.24495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.24549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.24574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.24730: variable 'ansible_distribution' from source: facts 8975 1727204046.24741: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.24754: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 8975 1727204046.24899: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204046.25076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.25108: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.25155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.25257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.25263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.25288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.25318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.25353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.25475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.25480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.25487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.25518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.25552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.25612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.25634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.25912: variable 'network_connections' from source: task vars 8975 1727204046.25917: variable 'controller_profile' from source: play vars 8975 1727204046.25948: variable 'controller_profile' from source: play vars 8975 1727204046.25963: variable 'controller_device' from source: play vars 8975 1727204046.26050: variable 'controller_device' from source: play vars 8975 1727204046.26068: variable 'port1_profile' from source: play vars 8975 
1727204046.26148: variable 'port1_profile' from source: play vars 8975 1727204046.26239: variable 'dhcp_interface1' from source: play vars 8975 1727204046.26243: variable 'dhcp_interface1' from source: play vars 8975 1727204046.26248: variable 'controller_profile' from source: play vars 8975 1727204046.26317: variable 'controller_profile' from source: play vars 8975 1727204046.26334: variable 'port2_profile' from source: play vars 8975 1727204046.26413: variable 'port2_profile' from source: play vars 8975 1727204046.26425: variable 'dhcp_interface2' from source: play vars 8975 1727204046.26502: variable 'dhcp_interface2' from source: play vars 8975 1727204046.26566: variable 'controller_profile' from source: play vars 8975 1727204046.26594: variable 'controller_profile' from source: play vars 8975 1727204046.26696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204046.26917: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204046.26989: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204046.27024: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204046.27069: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204046.27104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204046.27129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204046.27148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.27169: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204046.27225: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204046.27412: variable 'network_connections' from source: task vars 8975 1727204046.27416: variable 'controller_profile' from source: play vars 8975 1727204046.27467: variable 'controller_profile' from source: play vars 8975 1727204046.27474: variable 'controller_device' from source: play vars 8975 1727204046.27520: variable 'controller_device' from source: play vars 8975 1727204046.27530: variable 'port1_profile' from source: play vars 8975 1727204046.27579: variable 'port1_profile' from source: play vars 8975 1727204046.27585: variable 'dhcp_interface1' from source: play vars 8975 1727204046.27633: variable 'dhcp_interface1' from source: play vars 8975 1727204046.27636: variable 'controller_profile' from source: play vars 8975 1727204046.27684: variable 'controller_profile' from source: play vars 8975 1727204046.27691: variable 'port2_profile' from source: play vars 8975 1727204046.27736: variable 'port2_profile' from source: play vars 8975 1727204046.27742: variable 'dhcp_interface2' from source: play vars 8975 1727204046.27791: variable 'dhcp_interface2' from source: play vars 8975 1727204046.27794: variable 'controller_profile' from 
source: play vars 8975 1727204046.27840: variable 'controller_profile' from source: play vars 8975 1727204046.27869: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8975 1727204046.27872: when evaluation is False, skipping this task 8975 1727204046.27875: _execute() done 8975 1727204046.27877: dumping result to json 8975 1727204046.27884: done dumping result, returning 8975 1727204046.27890: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-9356-306d-00000000002b] 8975 1727204046.27898: sending task result for task 127b8e07-fff9-9356-306d-00000000002b skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8975 1727204046.28057: no more pending results, returning what we have 8975 1727204046.28060: results queue empty 8975 1727204046.28061: checking for any_errors_fatal 8975 1727204046.28070: done checking for any_errors_fatal 8975 1727204046.28070: checking for max_fail_percentage 8975 1727204046.28072: done checking for max_fail_percentage 8975 1727204046.28073: checking to see if all hosts have failed and the running result is not ok 8975 1727204046.28074: done checking to see if all hosts have failed 8975 1727204046.28075: getting the remaining hosts for this loop 8975 1727204046.28077: done getting the remaining hosts for this loop 8975 1727204046.28081: getting the next task for host managed-node2 8975 1727204046.28089: done getting next task for host managed-node2 8975 1727204046.28093: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 8975 1727204046.28097: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204046.28114: getting variables 8975 1727204046.28115: in VariableManager get_vars() 8975 1727204046.28159: Calling all_inventory to load vars for managed-node2 8975 1727204046.28162: Calling groups_inventory to load vars for managed-node2 8975 1727204046.28164: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204046.28179: done sending task result for task 127b8e07-fff9-9356-306d-00000000002b 8975 1727204046.28182: WORKER PROCESS EXITING 8975 1727204046.28191: Calling all_plugins_play to load vars for managed-node2 8975 1727204046.28194: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204046.28196: Calling groups_plugins_play to load vars for managed-node2 8975 1727204046.29625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204046.31059: done with get_vars() 8975 1727204046.31086: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 8975 1727204046.31155: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.113) 0:00:17.628 ***** 8975 1727204046.31182: entering _queue_task() for managed-node2/yum 8975 1727204046.31183: Creating lock for yum 8975 1727204046.31473: worker is 1 (out of 1 available) 8975 1727204046.31489: exiting _queue_task() for managed-node2/yum 8975 1727204046.31503: done queuing things up, now waiting for results queue to drain 8975 1727204046.31505: waiting for pending results... 
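The repeated lookups above (controller_profile, controller_device, port1_profile, dhcp_interface1, port2_profile, dhcp_interface2) are the play vars that templating resolves while rendering network_connections from task vars, and the false_condition __network_wireless_connections_defined or __network_team_connections_defined shows none of those connections are wireless or team types. A hedged sketch of how such play vars might feed network_connections is below; the variable names come from this log, but every concrete value is a placeholder invented for the illustration.

# Illustration only: variable names from the log, all values are placeholders.
controller_profile: bond0
controller_device: nm-bond
port1_profile: bond0.0
dhcp_interface1: test1
port2_profile: bond0.1
dhcp_interface2: test2

network_connections:
  - name: "{{ controller_profile }}"
    interface_name: "{{ controller_device }}"
    type: bond            # neither wireless nor team, hence the DNF/YUM checks skip
  - name: "{{ port1_profile }}"
    interface_name: "{{ dhcp_interface1 }}"
    type: ethernet
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    interface_name: "{{ dhcp_interface2 }}"
    type: ethernet
    controller: "{{ controller_profile }}"
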
8975 1727204046.31686: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 8975 1727204046.31778: in run() - task 127b8e07-fff9-9356-306d-00000000002c 8975 1727204046.31791: variable 'ansible_search_path' from source: unknown 8975 1727204046.31795: variable 'ansible_search_path' from source: unknown 8975 1727204046.31829: calling self._execute() 8975 1727204046.31898: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204046.31902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204046.31913: variable 'omit' from source: magic vars 8975 1727204046.32223: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.32234: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204046.32571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204046.34883: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204046.34963: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204046.35011: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204046.35052: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204046.35099: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204046.35190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.35233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.35310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.35358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.35381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.35491: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.35513: Evaluated conditional (ansible_distribution_major_version | int < 8): False 8975 1727204046.35521: when evaluation is False, skipping this task 8975 1727204046.35528: _execute() done 8975 1727204046.35535: dumping result to json 8975 1727204046.35543: done dumping result, returning 8975 1727204046.35555: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-9356-306d-00000000002c] 8975 1727204046.35567: sending task result for 
task 127b8e07-fff9-9356-306d-00000000002c skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 8975 1727204046.35749: no more pending results, returning what we have 8975 1727204046.35752: results queue empty 8975 1727204046.35752: checking for any_errors_fatal 8975 1727204046.35757: done checking for any_errors_fatal 8975 1727204046.35758: checking for max_fail_percentage 8975 1727204046.35760: done checking for max_fail_percentage 8975 1727204046.35761: checking to see if all hosts have failed and the running result is not ok 8975 1727204046.35762: done checking to see if all hosts have failed 8975 1727204046.35763: getting the remaining hosts for this loop 8975 1727204046.35765: done getting the remaining hosts for this loop 8975 1727204046.35771: getting the next task for host managed-node2 8975 1727204046.35780: done getting next task for host managed-node2 8975 1727204046.35784: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8975 1727204046.35786: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204046.35806: getting variables 8975 1727204046.35808: in VariableManager get_vars() 8975 1727204046.35854: Calling all_inventory to load vars for managed-node2 8975 1727204046.35857: Calling groups_inventory to load vars for managed-node2 8975 1727204046.35859: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204046.35990: Calling all_plugins_play to load vars for managed-node2 8975 1727204046.35994: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204046.35999: Calling groups_plugins_play to load vars for managed-node2 8975 1727204046.36551: done sending task result for task 127b8e07-fff9-9356-306d-00000000002c 8975 1727204046.36560: WORKER PROCESS EXITING 8975 1727204046.38581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204046.41501: done with get_vars() 8975 1727204046.41538: done getting variables 8975 1727204046.41626: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.104) 0:00:17.733 ***** 8975 1727204046.41663: entering _queue_task() for managed-node2/fail 8975 1727204046.42531: worker is 1 (out of 1 available) 8975 1727204046.42548: exiting _queue_task() for managed-node2/fail 8975 1727204046.42562: done queuing things 
up, now waiting for results queue to drain 8975 1727204046.42564: waiting for pending results... 8975 1727204046.43259: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8975 1727204046.43953: in run() - task 127b8e07-fff9-9356-306d-00000000002d 8975 1727204046.44003: variable 'ansible_search_path' from source: unknown 8975 1727204046.44006: variable 'ansible_search_path' from source: unknown 8975 1727204046.44036: calling self._execute() 8975 1727204046.44156: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204046.44171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204046.44220: variable 'omit' from source: magic vars 8975 1727204046.44636: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.44674: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204046.44818: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204046.45115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204046.48779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204046.48872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204046.48881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204046.48884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204046.48931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204046.49036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.49077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.49213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.49223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.49226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.49258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.49293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.49323: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.49375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.49415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.49534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.49540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.49544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.49596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.49647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.49825: variable 'network_connections' from source: task vars 8975 1727204046.49844: variable 'controller_profile' from source: play vars 8975 1727204046.50447: variable 'controller_profile' from source: play vars 8975 1727204046.50452: variable 'controller_device' from source: play vars 8975 1727204046.50455: variable 'controller_device' from source: play vars 8975 1727204046.50458: variable 'port1_profile' from source: play vars 8975 1727204046.50835: variable 'port1_profile' from source: play vars 8975 1727204046.50852: variable 'dhcp_interface1' from source: play vars 8975 1727204046.51124: variable 'dhcp_interface1' from source: play vars 8975 1727204046.51143: variable 'controller_profile' from source: play vars 8975 1727204046.51666: variable 'controller_profile' from source: play vars 8975 1727204046.51672: variable 'port2_profile' from source: play vars 8975 1727204046.51674: variable 'port2_profile' from source: play vars 8975 1727204046.51676: variable 'dhcp_interface2' from source: play vars 8975 1727204046.52005: variable 'dhcp_interface2' from source: play vars 8975 1727204046.52171: variable 'controller_profile' from source: play vars 8975 1727204046.52175: variable 'controller_profile' from source: play vars 8975 1727204046.52665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204046.53173: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204046.53269: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204046.53408: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204046.53683: Loading TestModule 
'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204046.53724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204046.53872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204046.53912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.53996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204046.54300: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204046.54858: variable 'network_connections' from source: task vars 8975 1727204046.54874: variable 'controller_profile' from source: play vars 8975 1727204046.55106: variable 'controller_profile' from source: play vars 8975 1727204046.55110: variable 'controller_device' from source: play vars 8975 1727204046.55155: variable 'controller_device' from source: play vars 8975 1727204046.55273: variable 'port1_profile' from source: play vars 8975 1727204046.55420: variable 'port1_profile' from source: play vars 8975 1727204046.55424: variable 'dhcp_interface1' from source: play vars 8975 1727204046.55492: variable 'dhcp_interface1' from source: play vars 8975 1727204046.55615: variable 'controller_profile' from source: play vars 8975 1727204046.55685: variable 'controller_profile' from source: play vars 8975 1727204046.55700: variable 'port2_profile' from source: play vars 8975 1727204046.55918: variable 'port2_profile' from source: play vars 8975 1727204046.55922: variable 'dhcp_interface2' from source: play vars 8975 1727204046.55949: variable 'dhcp_interface2' from source: play vars 8975 1727204046.55956: variable 'controller_profile' from source: play vars 8975 1727204046.56229: variable 'controller_profile' from source: play vars 8975 1727204046.56268: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8975 1727204046.56272: when evaluation is False, skipping this task 8975 1727204046.56274: _execute() done 8975 1727204046.56277: dumping result to json 8975 1727204046.56353: done dumping result, returning 8975 1727204046.56356: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-9356-306d-00000000002d] 8975 1727204046.56359: sending task result for task 127b8e07-fff9-9356-306d-00000000002d 8975 1727204046.56437: done sending task result for task 127b8e07-fff9-9356-306d-00000000002d 8975 1727204046.56441: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8975 1727204046.56504: no more pending results, returning what we have 8975 1727204046.56508: results queue empty 8975 1727204046.56509: checking for any_errors_fatal 8975 1727204046.56516: done checking for any_errors_fatal 8975 1727204046.56517: checking for 
max_fail_percentage 8975 1727204046.56519: done checking for max_fail_percentage 8975 1727204046.56520: checking to see if all hosts have failed and the running result is not ok 8975 1727204046.56521: done checking to see if all hosts have failed 8975 1727204046.56522: getting the remaining hosts for this loop 8975 1727204046.56524: done getting the remaining hosts for this loop 8975 1727204046.56529: getting the next task for host managed-node2 8975 1727204046.56537: done getting next task for host managed-node2 8975 1727204046.56541: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 8975 1727204046.56544: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204046.56559: getting variables 8975 1727204046.56561: in VariableManager get_vars() 8975 1727204046.56609: Calling all_inventory to load vars for managed-node2 8975 1727204046.56613: Calling groups_inventory to load vars for managed-node2 8975 1727204046.56615: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204046.56628: Calling all_plugins_play to load vars for managed-node2 8975 1727204046.56631: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204046.56635: Calling groups_plugins_play to load vars for managed-node2 8975 1727204046.59876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204046.62250: done with get_vars() 8975 1727204046.62293: done getting variables 8975 1727204046.62358: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.207) 0:00:17.940 ***** 8975 1727204046.62398: entering _queue_task() for managed-node2/package 8975 1727204046.62943: worker is 1 (out of 1 available) 8975 1727204046.62971: exiting _queue_task() for managed-node2/package 8975 1727204046.62987: done queuing things up, now waiting for results queue to drain 8975 1727204046.62989: waiting for pending results... 
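The Install packages task queued above loads the 'package' action module and, as the following lines show, resolves network_packages from the role defaults by assembling provider-specific lists such as __network_packages_default_nm and __network_packages_default_wpa_supplicant. A minimal sketch of a task consistent with that; only the task name, the 'package' action, the network_packages variable, and the distribution condition appear in the log, the remaining arguments are assumed.

# Hypothetical sketch; arguments other than the variable name are assumptions.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when:
    - "ansible_distribution_major_version != '6'"   # the only conditional the log shows evaluated for this task so far
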
8975 1727204046.63694: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 8975 1727204046.63868: in run() - task 127b8e07-fff9-9356-306d-00000000002e 8975 1727204046.64102: variable 'ansible_search_path' from source: unknown 8975 1727204046.64106: variable 'ansible_search_path' from source: unknown 8975 1727204046.64110: calling self._execute() 8975 1727204046.64317: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204046.64372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204046.64379: variable 'omit' from source: magic vars 8975 1727204046.65150: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.65177: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204046.66194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204046.66925: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204046.67378: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204046.67382: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204046.67387: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204046.67903: variable 'network_packages' from source: role '' defaults 8975 1727204046.68242: variable '__network_provider_setup' from source: role '' defaults 8975 1727204046.68287: variable '__network_service_name_default_nm' from source: role '' defaults 8975 1727204046.68452: variable '__network_service_name_default_nm' from source: role '' defaults 8975 1727204046.68671: variable '__network_packages_default_nm' from source: role '' defaults 8975 1727204046.68677: variable '__network_packages_default_nm' from source: role '' defaults 8975 1727204046.70068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204046.83292: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204046.83437: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204046.83510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204046.83580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204046.83644: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204046.83741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.83857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.83861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.83863: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.83885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.83948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.83985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.84017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.84074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.84095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.84373: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8975 1727204046.84515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.84548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.84587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.84730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.84734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.84780: variable 'ansible_python' from source: facts 8975 1727204046.84813: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8975 1727204046.84930: variable '__network_wpa_supplicant_required' from source: role '' defaults 8975 1727204046.85024: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8975 1727204046.85191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.85224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 8975 1727204046.85261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.85316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.85340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.85472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204046.85484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204046.85487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.85602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204046.85605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204046.85740: variable 'network_connections' from source: task vars 8975 1727204046.85753: variable 'controller_profile' from source: play vars 8975 1727204046.85874: variable 'controller_profile' from source: play vars 8975 1727204046.85891: variable 'controller_device' from source: play vars 8975 1727204046.86018: variable 'controller_device' from source: play vars 8975 1727204046.86044: variable 'port1_profile' from source: play vars 8975 1727204046.86171: variable 'port1_profile' from source: play vars 8975 1727204046.86192: variable 'dhcp_interface1' from source: play vars 8975 1727204046.86316: variable 'dhcp_interface1' from source: play vars 8975 1727204046.86334: variable 'controller_profile' from source: play vars 8975 1727204046.86582: variable 'controller_profile' from source: play vars 8975 1727204046.86694: variable 'port2_profile' from source: play vars 8975 1727204046.86891: variable 'port2_profile' from source: play vars 8975 1727204046.86912: variable 'dhcp_interface2' from source: play vars 8975 1727204046.87027: variable 'dhcp_interface2' from source: play vars 8975 1727204046.87044: variable 'controller_profile' from source: play vars 8975 1727204046.87160: variable 'controller_profile' from source: play vars 8975 1727204046.87257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204046.87299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204046.87343: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204046.87452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204046.87455: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204046.87821: variable 'network_connections' from source: task vars 8975 1727204046.87836: variable 'controller_profile' from source: play vars 8975 1727204046.87955: variable 'controller_profile' from source: play vars 8975 1727204046.87996: variable 'controller_device' from source: play vars 8975 1727204046.88187: variable 'controller_device' from source: play vars 8975 1727204046.88190: variable 'port1_profile' from source: play vars 8975 1727204046.88271: variable 'port1_profile' from source: play vars 8975 1727204046.88288: variable 'dhcp_interface1' from source: play vars 8975 1727204046.88435: variable 'dhcp_interface1' from source: play vars 8975 1727204046.88450: variable 'controller_profile' from source: play vars 8975 1727204046.88584: variable 'controller_profile' from source: play vars 8975 1727204046.88600: variable 'port2_profile' from source: play vars 8975 1727204046.88719: variable 'port2_profile' from source: play vars 8975 1727204046.88741: variable 'dhcp_interface2' from source: play vars 8975 1727204046.88856: variable 'dhcp_interface2' from source: play vars 8975 1727204046.88873: variable 'controller_profile' from source: play vars 8975 1727204046.88987: variable 'controller_profile' from source: play vars 8975 1727204046.89055: variable '__network_packages_default_wireless' from source: role '' defaults 8975 1727204046.89154: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204046.89525: variable 'network_connections' from source: task vars 8975 1727204046.89540: variable 'controller_profile' from source: play vars 8975 1727204046.89672: variable 'controller_profile' from source: play vars 8975 1727204046.89676: variable 'controller_device' from source: play vars 8975 1727204046.89708: variable 'controller_device' from source: play vars 8975 1727204046.89731: variable 'port1_profile' from source: play vars 8975 1727204046.89803: variable 'port1_profile' from source: play vars 8975 1727204046.89821: variable 'dhcp_interface1' from source: play vars 8975 1727204046.89898: variable 'dhcp_interface1' from source: play vars 8975 1727204046.89909: variable 'controller_profile' from source: play vars 8975 1727204046.89986: variable 'controller_profile' from source: play vars 8975 1727204046.90042: variable 'port2_profile' from source: play vars 8975 1727204046.90076: variable 'port2_profile' from source: play vars 8975 1727204046.90090: variable 'dhcp_interface2' from source: play vars 8975 1727204046.90169: variable 'dhcp_interface2' from source: play vars 8975 1727204046.90182: variable 'controller_profile' from source: play vars 8975 1727204046.90260: variable 'controller_profile' from source: play vars 8975 1727204046.90300: variable '__network_packages_default_team' from source: role '' defaults 8975 1727204046.90471: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204046.90964: variable 'network_connections' from source: task vars 8975 1727204046.90984: variable 'controller_profile' from source: 
play vars 8975 1727204046.91098: variable 'controller_profile' from source: play vars 8975 1727204046.91102: variable 'controller_device' from source: play vars 8975 1727204046.91162: variable 'controller_device' from source: play vars 8975 1727204046.91183: variable 'port1_profile' from source: play vars 8975 1727204046.91260: variable 'port1_profile' from source: play vars 8975 1727204046.91276: variable 'dhcp_interface1' from source: play vars 8975 1727204046.91371: variable 'dhcp_interface1' from source: play vars 8975 1727204046.91375: variable 'controller_profile' from source: play vars 8975 1727204046.91438: variable 'controller_profile' from source: play vars 8975 1727204046.91452: variable 'port2_profile' from source: play vars 8975 1727204046.91571: variable 'port2_profile' from source: play vars 8975 1727204046.91574: variable 'dhcp_interface2' from source: play vars 8975 1727204046.91608: variable 'dhcp_interface2' from source: play vars 8975 1727204046.91625: variable 'controller_profile' from source: play vars 8975 1727204046.91698: variable 'controller_profile' from source: play vars 8975 1727204046.91781: variable '__network_service_name_default_initscripts' from source: role '' defaults 8975 1727204046.91854: variable '__network_service_name_default_initscripts' from source: role '' defaults 8975 1727204046.91870: variable '__network_packages_default_initscripts' from source: role '' defaults 8975 1727204046.91953: variable '__network_packages_default_initscripts' from source: role '' defaults 8975 1727204046.92240: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8975 1727204046.92863: variable 'network_connections' from source: task vars 8975 1727204046.92937: variable 'controller_profile' from source: play vars 8975 1727204046.92968: variable 'controller_profile' from source: play vars 8975 1727204046.92988: variable 'controller_device' from source: play vars 8975 1727204046.93063: variable 'controller_device' from source: play vars 8975 1727204046.93086: variable 'port1_profile' from source: play vars 8975 1727204046.93164: variable 'port1_profile' from source: play vars 8975 1727204046.93179: variable 'dhcp_interface1' from source: play vars 8975 1727204046.93264: variable 'dhcp_interface1' from source: play vars 8975 1727204046.93271: variable 'controller_profile' from source: play vars 8975 1727204046.93373: variable 'controller_profile' from source: play vars 8975 1727204046.93376: variable 'port2_profile' from source: play vars 8975 1727204046.93408: variable 'port2_profile' from source: play vars 8975 1727204046.93420: variable 'dhcp_interface2' from source: play vars 8975 1727204046.93487: variable 'dhcp_interface2' from source: play vars 8975 1727204046.93499: variable 'controller_profile' from source: play vars 8975 1727204046.93560: variable 'controller_profile' from source: play vars 8975 1727204046.93576: variable 'ansible_distribution' from source: facts 8975 1727204046.93591: variable '__network_rh_distros' from source: role '' defaults 8975 1727204046.93771: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.93774: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8975 1727204046.93809: variable 'ansible_distribution' from source: facts 8975 1727204046.93817: variable '__network_rh_distros' from source: role '' defaults 8975 1727204046.93825: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.93838: variable 
'__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8975 1727204046.94011: variable 'ansible_distribution' from source: facts 8975 1727204046.94019: variable '__network_rh_distros' from source: role '' defaults 8975 1727204046.94030: variable 'ansible_distribution_major_version' from source: facts 8975 1727204046.94070: variable 'network_provider' from source: set_fact 8975 1727204046.94090: variable 'ansible_facts' from source: unknown 8975 1727204046.95058: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 8975 1727204046.95102: when evaluation is False, skipping this task 8975 1727204046.95110: _execute() done 8975 1727204046.95117: dumping result to json 8975 1727204046.95123: done dumping result, returning 8975 1727204046.95167: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-9356-306d-00000000002e] 8975 1727204046.95177: sending task result for task 127b8e07-fff9-9356-306d-00000000002e skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 8975 1727204046.95511: no more pending results, returning what we have 8975 1727204046.95515: results queue empty 8975 1727204046.95515: checking for any_errors_fatal 8975 1727204046.95523: done checking for any_errors_fatal 8975 1727204046.95524: checking for max_fail_percentage 8975 1727204046.95525: done checking for max_fail_percentage 8975 1727204046.95526: checking to see if all hosts have failed and the running result is not ok 8975 1727204046.95527: done checking to see if all hosts have failed 8975 1727204046.95528: getting the remaining hosts for this loop 8975 1727204046.95530: done getting the remaining hosts for this loop 8975 1727204046.95534: getting the next task for host managed-node2 8975 1727204046.95545: done getting next task for host managed-node2 8975 1727204046.95550: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8975 1727204046.95555: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204046.95575: getting variables 8975 1727204046.95577: in VariableManager get_vars() 8975 1727204046.95627: Calling all_inventory to load vars for managed-node2 8975 1727204046.95630: Calling groups_inventory to load vars for managed-node2 8975 1727204046.95632: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204046.95639: done sending task result for task 127b8e07-fff9-9356-306d-00000000002e 8975 1727204046.95642: WORKER PROCESS EXITING 8975 1727204046.95884: Calling all_plugins_play to load vars for managed-node2 8975 1727204046.95905: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204046.95910: Calling groups_plugins_play to load vars for managed-node2 8975 1727204047.03729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204047.05740: done with get_vars() 8975 1727204047.05771: done getting variables 8975 1727204047.05833: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.434) 0:00:18.375 ***** 8975 1727204047.05859: entering _queue_task() for managed-node2/package 8975 1727204047.06201: worker is 1 (out of 1 available) 8975 1727204047.06217: exiting _queue_task() for managed-node2/package 8975 1727204047.06232: done queuing things up, now waiting for results queue to drain 8975 1727204047.06234: waiting for pending results... 
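The skip recorded just above comes from the task's when: guard, which only runs the package install when some entry in network_packages is missing from the gathered package facts (hence the reported false_condition "not network_packages is subset(ansible_facts.packages.keys())"). A minimal sketch of a task guarded this way, assuming ansible.builtin.package and the variable names shown in the log — illustrative only, not the role's actual source:

  - name: Install packages   # sketch only
    ansible.builtin.package:
      name: "{{ network_packages }}"   # assumed: the log shows the guard, not the module arguments
      state: present
    when:
      - not network_packages is subset(ansible_facts.packages.keys())

The subset test presumes ansible.builtin.package_facts populated ansible_facts.packages earlier in the play; when every requested package is already installed, the guard is false and the task is skipped exactly as logged.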
8975 1727204047.06456: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8975 1727204047.06603: in run() - task 127b8e07-fff9-9356-306d-00000000002f 8975 1727204047.06628: variable 'ansible_search_path' from source: unknown 8975 1727204047.06632: variable 'ansible_search_path' from source: unknown 8975 1727204047.06696: calling self._execute() 8975 1727204047.06783: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204047.06787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204047.06799: variable 'omit' from source: magic vars 8975 1727204047.07296: variable 'ansible_distribution_major_version' from source: facts 8975 1727204047.07300: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204047.07376: variable 'network_state' from source: role '' defaults 8975 1727204047.07390: Evaluated conditional (network_state != {}): False 8975 1727204047.07393: when evaluation is False, skipping this task 8975 1727204047.07396: _execute() done 8975 1727204047.07399: dumping result to json 8975 1727204047.07401: done dumping result, returning 8975 1727204047.07404: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-9356-306d-00000000002f] 8975 1727204047.07433: sending task result for task 127b8e07-fff9-9356-306d-00000000002f 8975 1727204047.07523: done sending task result for task 127b8e07-fff9-9356-306d-00000000002f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204047.07595: no more pending results, returning what we have 8975 1727204047.07600: results queue empty 8975 1727204047.07600: checking for any_errors_fatal 8975 1727204047.07609: done checking for any_errors_fatal 8975 1727204047.07610: checking for max_fail_percentage 8975 1727204047.07612: done checking for max_fail_percentage 8975 1727204047.07613: checking to see if all hosts have failed and the running result is not ok 8975 1727204047.07614: done checking to see if all hosts have failed 8975 1727204047.07615: getting the remaining hosts for this loop 8975 1727204047.07617: done getting the remaining hosts for this loop 8975 1727204047.07622: getting the next task for host managed-node2 8975 1727204047.07631: done getting next task for host managed-node2 8975 1727204047.07635: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8975 1727204047.07640: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204047.07656: getting variables 8975 1727204047.07657: in VariableManager get_vars() 8975 1727204047.07705: Calling all_inventory to load vars for managed-node2 8975 1727204047.07708: Calling groups_inventory to load vars for managed-node2 8975 1727204047.07710: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204047.07720: Calling all_plugins_play to load vars for managed-node2 8975 1727204047.07722: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204047.07725: Calling groups_plugins_play to load vars for managed-node2 8975 1727204047.08273: WORKER PROCESS EXITING 8975 1727204047.09371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204047.10776: done with get_vars() 8975 1727204047.10804: done getting variables 8975 1727204047.10878: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.050) 0:00:18.426 ***** 8975 1727204047.10914: entering _queue_task() for managed-node2/package 8975 1727204047.11228: worker is 1 (out of 1 available) 8975 1727204047.11246: exiting _queue_task() for managed-node2/package 8975 1727204047.11260: done queuing things up, now waiting for results queue to drain 8975 1727204047.11262: waiting for pending results... 
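The two conditionals logged for the task above (ansible_distribution_major_version != '6' evaluating True, then network_state != {} evaluating False) behave like a stacked when: list that stops at the first false entry. A hedged sketch of such a task, with the package names taken only from the task title — the real role may structure this differently:

  - name: Install NetworkManager and nmstate when using network_state variable   # sketch only
    ansible.builtin.package:
      name:
        - NetworkManager
        - nmstate
      state: present
    when:
      - ansible_distribution_major_version != '6'   # placement assumed; may be inherited from an enclosing block
      - network_state != {}

Because network_state comes from the role defaults and is left at {} in this run, the second condition is false and the task is skipped.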
8975 1727204047.11494: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8975 1727204047.11614: in run() - task 127b8e07-fff9-9356-306d-000000000030 8975 1727204047.11625: variable 'ansible_search_path' from source: unknown 8975 1727204047.11631: variable 'ansible_search_path' from source: unknown 8975 1727204047.11667: calling self._execute() 8975 1727204047.11749: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204047.11759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204047.11770: variable 'omit' from source: magic vars 8975 1727204047.12106: variable 'ansible_distribution_major_version' from source: facts 8975 1727204047.12116: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204047.12208: variable 'network_state' from source: role '' defaults 8975 1727204047.12216: Evaluated conditional (network_state != {}): False 8975 1727204047.12220: when evaluation is False, skipping this task 8975 1727204047.12223: _execute() done 8975 1727204047.12225: dumping result to json 8975 1727204047.12231: done dumping result, returning 8975 1727204047.12237: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-9356-306d-000000000030] 8975 1727204047.12242: sending task result for task 127b8e07-fff9-9356-306d-000000000030 8975 1727204047.12347: done sending task result for task 127b8e07-fff9-9356-306d-000000000030 8975 1727204047.12351: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204047.12402: no more pending results, returning what we have 8975 1727204047.12405: results queue empty 8975 1727204047.12406: checking for any_errors_fatal 8975 1727204047.12416: done checking for any_errors_fatal 8975 1727204047.12417: checking for max_fail_percentage 8975 1727204047.12419: done checking for max_fail_percentage 8975 1727204047.12419: checking to see if all hosts have failed and the running result is not ok 8975 1727204047.12421: done checking to see if all hosts have failed 8975 1727204047.12422: getting the remaining hosts for this loop 8975 1727204047.12424: done getting the remaining hosts for this loop 8975 1727204047.12430: getting the next task for host managed-node2 8975 1727204047.12437: done getting next task for host managed-node2 8975 1727204047.12441: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8975 1727204047.12444: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204047.12460: getting variables 8975 1727204047.12462: in VariableManager get_vars() 8975 1727204047.12503: Calling all_inventory to load vars for managed-node2 8975 1727204047.12506: Calling groups_inventory to load vars for managed-node2 8975 1727204047.12508: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204047.12517: Calling all_plugins_play to load vars for managed-node2 8975 1727204047.12519: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204047.12522: Calling groups_plugins_play to load vars for managed-node2 8975 1727204047.13552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204047.15062: done with get_vars() 8975 1727204047.15091: done getting variables 8975 1727204047.15178: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.042) 0:00:18.468 ***** 8975 1727204047.15205: entering _queue_task() for managed-node2/service 8975 1727204047.15206: Creating lock for service 8975 1727204047.15501: worker is 1 (out of 1 available) 8975 1727204047.15518: exiting _queue_task() for managed-node2/service 8975 1727204047.15531: done queuing things up, now waiting for results queue to drain 8975 1727204047.15532: waiting for pending results... 
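The same network_state guard just skipped the python3-libnmstate install. The task queued above is a conditional service restart; based on its title and the false_condition reported a little further down, it plausibly looks like the following (the service name and module choice are assumptions; the role resolves the service name from its own variables):

  - name: Restart NetworkManager due to wireless or team interfaces   # sketch only
    ansible.builtin.service:
      name: NetworkManager          # assumed literal; the role likely templates this from a variable
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined

In this run neither wireless nor team connections are defined in network_connections, so the guard evaluates false and the restart is skipped.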
8975 1727204047.15735: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8975 1727204047.15864: in run() - task 127b8e07-fff9-9356-306d-000000000031 8975 1727204047.15881: variable 'ansible_search_path' from source: unknown 8975 1727204047.15885: variable 'ansible_search_path' from source: unknown 8975 1727204047.15915: calling self._execute() 8975 1727204047.16011: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204047.16017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204047.16027: variable 'omit' from source: magic vars 8975 1727204047.16418: variable 'ansible_distribution_major_version' from source: facts 8975 1727204047.16430: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204047.16521: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204047.16690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204047.19071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204047.19175: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204047.19213: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204047.19235: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204047.19259: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204047.19348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204047.19402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204047.19429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.19468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204047.19475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204047.19520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204047.19552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204047.19569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8975 1727204047.19599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204047.19613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204047.19679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204047.19715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204047.19719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.19771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204047.19779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204047.19942: variable 'network_connections' from source: task vars 8975 1727204047.19962: variable 'controller_profile' from source: play vars 8975 1727204047.20015: variable 'controller_profile' from source: play vars 8975 1727204047.20041: variable 'controller_device' from source: play vars 8975 1727204047.20101: variable 'controller_device' from source: play vars 8975 1727204047.20105: variable 'port1_profile' from source: play vars 8975 1727204047.20151: variable 'port1_profile' from source: play vars 8975 1727204047.20158: variable 'dhcp_interface1' from source: play vars 8975 1727204047.20205: variable 'dhcp_interface1' from source: play vars 8975 1727204047.20219: variable 'controller_profile' from source: play vars 8975 1727204047.20270: variable 'controller_profile' from source: play vars 8975 1727204047.20277: variable 'port2_profile' from source: play vars 8975 1727204047.20323: variable 'port2_profile' from source: play vars 8975 1727204047.20333: variable 'dhcp_interface2' from source: play vars 8975 1727204047.20376: variable 'dhcp_interface2' from source: play vars 8975 1727204047.20382: variable 'controller_profile' from source: play vars 8975 1727204047.20444: variable 'controller_profile' from source: play vars 8975 1727204047.20524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204047.20670: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204047.20700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204047.20724: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204047.20750: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204047.20789: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204047.20806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204047.20825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.20848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204047.20902: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204047.21090: variable 'network_connections' from source: task vars 8975 1727204047.21094: variable 'controller_profile' from source: play vars 8975 1727204047.21142: variable 'controller_profile' from source: play vars 8975 1727204047.21148: variable 'controller_device' from source: play vars 8975 1727204047.21211: variable 'controller_device' from source: play vars 8975 1727204047.21217: variable 'port1_profile' from source: play vars 8975 1727204047.21279: variable 'port1_profile' from source: play vars 8975 1727204047.21282: variable 'dhcp_interface1' from source: play vars 8975 1727204047.21346: variable 'dhcp_interface1' from source: play vars 8975 1727204047.21349: variable 'controller_profile' from source: play vars 8975 1727204047.21419: variable 'controller_profile' from source: play vars 8975 1727204047.21427: variable 'port2_profile' from source: play vars 8975 1727204047.21498: variable 'port2_profile' from source: play vars 8975 1727204047.21501: variable 'dhcp_interface2' from source: play vars 8975 1727204047.21558: variable 'dhcp_interface2' from source: play vars 8975 1727204047.21568: variable 'controller_profile' from source: play vars 8975 1727204047.21617: variable 'controller_profile' from source: play vars 8975 1727204047.21668: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8975 1727204047.21672: when evaluation is False, skipping this task 8975 1727204047.21675: _execute() done 8975 1727204047.21677: dumping result to json 8975 1727204047.21680: done dumping result, returning 8975 1727204047.21682: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-9356-306d-000000000031] 8975 1727204047.21684: sending task result for task 127b8e07-fff9-9356-306d-000000000031 8975 1727204047.21800: done sending task result for task 127b8e07-fff9-9356-306d-000000000031 8975 1727204047.21803: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8975 1727204047.21964: no more pending results, returning what we have 8975 1727204047.21969: results queue empty 8975 1727204047.21970: checking for any_errors_fatal 8975 1727204047.21987: done checking for any_errors_fatal 8975 1727204047.21991: checking for max_fail_percentage 8975 1727204047.21993: done checking for max_fail_percentage 8975 1727204047.21994: checking to see if all hosts have failed and 
the running result is not ok 8975 1727204047.21995: done checking to see if all hosts have failed 8975 1727204047.21996: getting the remaining hosts for this loop 8975 1727204047.21997: done getting the remaining hosts for this loop 8975 1727204047.22001: getting the next task for host managed-node2 8975 1727204047.22008: done getting next task for host managed-node2 8975 1727204047.22013: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8975 1727204047.22015: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204047.22074: getting variables 8975 1727204047.22076: in VariableManager get_vars() 8975 1727204047.22133: Calling all_inventory to load vars for managed-node2 8975 1727204047.22135: Calling groups_inventory to load vars for managed-node2 8975 1727204047.22137: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204047.22145: Calling all_plugins_play to load vars for managed-node2 8975 1727204047.22147: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204047.22149: Calling groups_plugins_play to load vars for managed-node2 8975 1727204047.23687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204047.25206: done with get_vars() 8975 1727204047.25236: done getting variables 8975 1727204047.25289: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.101) 0:00:18.570 ***** 8975 1727204047.25317: entering _queue_task() for managed-node2/service 8975 1727204047.25599: worker is 1 (out of 1 available) 8975 1727204047.25615: exiting _queue_task() for managed-node2/service 8975 1727204047.25630: done queuing things up, now waiting for results queue to drain 8975 1727204047.25632: waiting for pending results... 
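The task announced above ("Enable and start NetworkManager", main.yml:122) is the first one in this stretch whose guard passes, as the evaluation of network_provider == "nm" or network_state != {} just below shows. A hedged sketch of what such a task might look like, using the network_service_name variable the log resolves from the role defaults (the state/enabled values are inferred from the task title, not from the source):

  - name: Enable and start NetworkManager   # sketch only
    ansible.builtin.service:
      name: "{{ network_service_name }}"
      state: started
      enabled: true
    when: network_provider == "nm" or network_state != {}

Since network_provider was set to "nm" via set_fact earlier in the play, the condition is true and the task proceeds to the connection setup and module transfer that follow.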
8975 1727204047.25828: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8975 1727204047.25947: in run() - task 127b8e07-fff9-9356-306d-000000000032 8975 1727204047.25959: variable 'ansible_search_path' from source: unknown 8975 1727204047.25963: variable 'ansible_search_path' from source: unknown 8975 1727204047.26001: calling self._execute() 8975 1727204047.26081: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204047.26087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204047.26094: variable 'omit' from source: magic vars 8975 1727204047.26407: variable 'ansible_distribution_major_version' from source: facts 8975 1727204047.26422: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204047.26544: variable 'network_provider' from source: set_fact 8975 1727204047.26547: variable 'network_state' from source: role '' defaults 8975 1727204047.26557: Evaluated conditional (network_provider == "nm" or network_state != {}): True 8975 1727204047.26566: variable 'omit' from source: magic vars 8975 1727204047.26604: variable 'omit' from source: magic vars 8975 1727204047.26625: variable 'network_service_name' from source: role '' defaults 8975 1727204047.26723: variable 'network_service_name' from source: role '' defaults 8975 1727204047.26823: variable '__network_provider_setup' from source: role '' defaults 8975 1727204047.26827: variable '__network_service_name_default_nm' from source: role '' defaults 8975 1727204047.26892: variable '__network_service_name_default_nm' from source: role '' defaults 8975 1727204047.26913: variable '__network_packages_default_nm' from source: role '' defaults 8975 1727204047.26978: variable '__network_packages_default_nm' from source: role '' defaults 8975 1727204047.27247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204047.29308: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204047.29387: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204047.29419: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204047.29466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204047.29490: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204047.29575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204047.29602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204047.29623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.29669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 
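The long run of controller_profile / controller_device / port1_profile / dhcp_interface1 / port2_profile / dhcp_interface2 lookups in this stretch is the templating of the network_connections task variable, whose entries reference those play vars. One plausible shape that would produce exactly this lookup pattern is a controller profile with two port profiles attached to the DHCP test interfaces; this is a guess for illustration only, not the test play's actual contents:

  network_connections:
    - name: "{{ controller_profile }}"
      interface_name: "{{ controller_device }}"
      type: bond           # assumed; a team controller is ruled out by __network_team_connections_defined being false
      state: up
    - name: "{{ port1_profile }}"
      interface_name: "{{ dhcp_interface1 }}"
      controller: "{{ controller_profile }}"
      type: ethernet
      state: up
    - name: "{{ port2_profile }}"
      interface_name: "{{ dhcp_interface2 }}"
      controller: "{{ controller_profile }}"
      type: ethernet
      state: up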
8975 1727204047.29682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204047.29721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204047.29739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204047.29762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.29800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204047.29813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204047.30022: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8975 1727204047.30116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204047.30133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204047.30154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.30185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204047.30196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204047.30279: variable 'ansible_python' from source: facts 8975 1727204047.30313: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8975 1727204047.30384: variable '__network_wpa_supplicant_required' from source: role '' defaults 8975 1727204047.30463: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8975 1727204047.30578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204047.30600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204047.30620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.30650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204047.30661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204047.30712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204047.30753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204047.30756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.30787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204047.30797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204047.30919: variable 'network_connections' from source: task vars 8975 1727204047.30927: variable 'controller_profile' from source: play vars 8975 1727204047.30991: variable 'controller_profile' from source: play vars 8975 1727204047.31001: variable 'controller_device' from source: play vars 8975 1727204047.31056: variable 'controller_device' from source: play vars 8975 1727204047.31088: variable 'port1_profile' from source: play vars 8975 1727204047.31138: variable 'port1_profile' from source: play vars 8975 1727204047.31148: variable 'dhcp_interface1' from source: play vars 8975 1727204047.31206: variable 'dhcp_interface1' from source: play vars 8975 1727204047.31216: variable 'controller_profile' from source: play vars 8975 1727204047.31270: variable 'controller_profile' from source: play vars 8975 1727204047.31281: variable 'port2_profile' from source: play vars 8975 1727204047.31340: variable 'port2_profile' from source: play vars 8975 1727204047.31351: variable 'dhcp_interface2' from source: play vars 8975 1727204047.31434: variable 'dhcp_interface2' from source: play vars 8975 1727204047.31449: variable 'controller_profile' from source: play vars 8975 1727204047.31516: variable 'controller_profile' from source: play vars 8975 1727204047.31633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204047.32060: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204047.32105: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204047.32171: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204047.32188: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204047.32246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204047.32272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204047.32296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204047.32322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204047.32366: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204047.32603: variable 'network_connections' from source: task vars 8975 1727204047.32609: variable 'controller_profile' from source: play vars 8975 1727204047.32671: variable 'controller_profile' from source: play vars 8975 1727204047.32683: variable 'controller_device' from source: play vars 8975 1727204047.32753: variable 'controller_device' from source: play vars 8975 1727204047.32764: variable 'port1_profile' from source: play vars 8975 1727204047.32822: variable 'port1_profile' from source: play vars 8975 1727204047.32832: variable 'dhcp_interface1' from source: play vars 8975 1727204047.32888: variable 'dhcp_interface1' from source: play vars 8975 1727204047.32898: variable 'controller_profile' from source: play vars 8975 1727204047.32965: variable 'controller_profile' from source: play vars 8975 1727204047.32978: variable 'port2_profile' from source: play vars 8975 1727204047.33033: variable 'port2_profile' from source: play vars 8975 1727204047.33042: variable 'dhcp_interface2' from source: play vars 8975 1727204047.33106: variable 'dhcp_interface2' from source: play vars 8975 1727204047.33113: variable 'controller_profile' from source: play vars 8975 1727204047.33185: variable 'controller_profile' from source: play vars 8975 1727204047.33235: variable '__network_packages_default_wireless' from source: role '' defaults 8975 1727204047.33321: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204047.33571: variable 'network_connections' from source: task vars 8975 1727204047.33575: variable 'controller_profile' from source: play vars 8975 1727204047.33634: variable 'controller_profile' from source: play vars 8975 1727204047.33638: variable 'controller_device' from source: play vars 8975 1727204047.33702: variable 'controller_device' from source: play vars 8975 1727204047.33715: variable 'port1_profile' from source: play vars 8975 1727204047.33763: variable 'port1_profile' from source: play vars 8975 1727204047.33771: variable 'dhcp_interface1' from source: play vars 8975 1727204047.33824: variable 'dhcp_interface1' from source: play vars 8975 1727204047.33830: variable 'controller_profile' from source: play vars 8975 1727204047.33882: variable 'controller_profile' from source: play vars 8975 1727204047.33889: variable 'port2_profile' from source: play vars 8975 1727204047.33963: variable 'port2_profile' from source: play vars 8975 1727204047.33968: variable 'dhcp_interface2' from source: play vars 8975 1727204047.34051: 
variable 'dhcp_interface2' from source: play vars 8975 1727204047.34055: variable 'controller_profile' from source: play vars 8975 1727204047.34100: variable 'controller_profile' from source: play vars 8975 1727204047.34122: variable '__network_packages_default_team' from source: role '' defaults 8975 1727204047.34184: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204047.34421: variable 'network_connections' from source: task vars 8975 1727204047.34425: variable 'controller_profile' from source: play vars 8975 1727204047.34496: variable 'controller_profile' from source: play vars 8975 1727204047.34503: variable 'controller_device' from source: play vars 8975 1727204047.34572: variable 'controller_device' from source: play vars 8975 1727204047.34581: variable 'port1_profile' from source: play vars 8975 1727204047.34652: variable 'port1_profile' from source: play vars 8975 1727204047.34658: variable 'dhcp_interface1' from source: play vars 8975 1727204047.34716: variable 'dhcp_interface1' from source: play vars 8975 1727204047.34719: variable 'controller_profile' from source: play vars 8975 1727204047.34773: variable 'controller_profile' from source: play vars 8975 1727204047.34780: variable 'port2_profile' from source: play vars 8975 1727204047.34835: variable 'port2_profile' from source: play vars 8975 1727204047.34845: variable 'dhcp_interface2' from source: play vars 8975 1727204047.34894: variable 'dhcp_interface2' from source: play vars 8975 1727204047.34899: variable 'controller_profile' from source: play vars 8975 1727204047.34954: variable 'controller_profile' from source: play vars 8975 1727204047.35006: variable '__network_service_name_default_initscripts' from source: role '' defaults 8975 1727204047.35057: variable '__network_service_name_default_initscripts' from source: role '' defaults 8975 1727204047.35060: variable '__network_packages_default_initscripts' from source: role '' defaults 8975 1727204047.35108: variable '__network_packages_default_initscripts' from source: role '' defaults 8975 1727204047.35269: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8975 1727204047.35626: variable 'network_connections' from source: task vars 8975 1727204047.35630: variable 'controller_profile' from source: play vars 8975 1727204047.35687: variable 'controller_profile' from source: play vars 8975 1727204047.35694: variable 'controller_device' from source: play vars 8975 1727204047.35742: variable 'controller_device' from source: play vars 8975 1727204047.35750: variable 'port1_profile' from source: play vars 8975 1727204047.35797: variable 'port1_profile' from source: play vars 8975 1727204047.35804: variable 'dhcp_interface1' from source: play vars 8975 1727204047.35850: variable 'dhcp_interface1' from source: play vars 8975 1727204047.35855: variable 'controller_profile' from source: play vars 8975 1727204047.35907: variable 'controller_profile' from source: play vars 8975 1727204047.35914: variable 'port2_profile' from source: play vars 8975 1727204047.35960: variable 'port2_profile' from source: play vars 8975 1727204047.35968: variable 'dhcp_interface2' from source: play vars 8975 1727204047.36014: variable 'dhcp_interface2' from source: play vars 8975 1727204047.36020: variable 'controller_profile' from source: play vars 8975 1727204047.36064: variable 'controller_profile' from source: play vars 8975 1727204047.36075: variable 'ansible_distribution' from source: facts 8975 1727204047.36078: variable 
'__network_rh_distros' from source: role '' defaults 8975 1727204047.36092: variable 'ansible_distribution_major_version' from source: facts 8975 1727204047.36110: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8975 1727204047.36235: variable 'ansible_distribution' from source: facts 8975 1727204047.36238: variable '__network_rh_distros' from source: role '' defaults 8975 1727204047.36244: variable 'ansible_distribution_major_version' from source: facts 8975 1727204047.36250: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8975 1727204047.36375: variable 'ansible_distribution' from source: facts 8975 1727204047.36379: variable '__network_rh_distros' from source: role '' defaults 8975 1727204047.36384: variable 'ansible_distribution_major_version' from source: facts 8975 1727204047.36412: variable 'network_provider' from source: set_fact 8975 1727204047.36437: variable 'omit' from source: magic vars 8975 1727204047.36458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204047.36485: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204047.36500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204047.36515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204047.36524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204047.36562: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204047.36567: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204047.36571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204047.36670: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204047.36673: Set connection var ansible_connection to ssh 8975 1727204047.36680: Set connection var ansible_shell_executable to /bin/sh 8975 1727204047.36686: Set connection var ansible_timeout to 10 8975 1727204047.36688: Set connection var ansible_shell_type to sh 8975 1727204047.36698: Set connection var ansible_pipelining to False 8975 1727204047.36718: variable 'ansible_shell_executable' from source: unknown 8975 1727204047.36722: variable 'ansible_connection' from source: unknown 8975 1727204047.36731: variable 'ansible_module_compression' from source: unknown 8975 1727204047.36734: variable 'ansible_shell_type' from source: unknown 8975 1727204047.36737: variable 'ansible_shell_executable' from source: unknown 8975 1727204047.36739: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204047.36763: variable 'ansible_pipelining' from source: unknown 8975 1727204047.36765: variable 'ansible_timeout' from source: unknown 8975 1727204047.36789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204047.36861: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204047.36871: variable 'omit' from source: magic vars 8975 1727204047.36881: starting 
attempt loop 8975 1727204047.36891: running the handler 8975 1727204047.36945: variable 'ansible_facts' from source: unknown 8975 1727204047.37667: _low_level_execute_command(): starting 8975 1727204047.37730: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204047.38306: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204047.38333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204047.38385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204047.38388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204047.38409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204047.38495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204047.40330: stdout chunk (state=3): >>>/root <<< 8975 1727204047.40434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204047.40516: stderr chunk (state=3): >>><<< 8975 1727204047.40519: stdout chunk (state=3): >>><<< 8975 1727204047.40586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204047.40594: _low_level_execute_command(): starting 8975 1727204047.40597: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946 `" && echo 
ansible-tmp-1727204047.4054353-10833-59050539413946="` echo /root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946 `" ) && sleep 0' 8975 1727204047.41103: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204047.41111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204047.41114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204047.41116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204047.41170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204047.41188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204047.41277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204047.43270: stdout chunk (state=3): >>>ansible-tmp-1727204047.4054353-10833-59050539413946=/root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946 <<< 8975 1727204047.43455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204047.43492: stderr chunk (state=3): >>><<< 8975 1727204047.43495: stdout chunk (state=3): >>><<< 8975 1727204047.43521: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204047.4054353-10833-59050539413946=/root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204047.43551: variable 'ansible_module_compression' from source: unknown 8975 1727204047.43607: ANSIBALLZ: Using generic lock for 
ansible.legacy.systemd 8975 1727204047.43612: ANSIBALLZ: Acquiring lock 8975 1727204047.43615: ANSIBALLZ: Lock acquired: 140501807209920 8975 1727204047.43617: ANSIBALLZ: Creating module 8975 1727204047.69938: ANSIBALLZ: Writing module into payload 8975 1727204047.70062: ANSIBALLZ: Writing module 8975 1727204047.70093: ANSIBALLZ: Renaming module 8975 1727204047.70099: ANSIBALLZ: Done creating module 8975 1727204047.70136: variable 'ansible_facts' from source: unknown 8975 1727204047.70283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/AnsiballZ_systemd.py 8975 1727204047.70426: Sending initial data 8975 1727204047.70432: Sent initial data (154 bytes) 8975 1727204047.71219: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204047.71244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204047.71325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204047.71374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204047.71396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204047.71499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204047.73246: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204047.73321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204047.73419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpe54hkytu /root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/AnsiballZ_systemd.py <<< 8975 1727204047.73424: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/AnsiballZ_systemd.py" <<< 8975 1727204047.73477: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpe54hkytu" to remote "/root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/AnsiballZ_systemd.py" <<< 8975 1727204047.75353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204047.75374: stderr chunk (state=3): >>><<< 8975 1727204047.75448: stdout chunk (state=3): >>><<< 8975 1727204047.75452: done transferring module to remote 8975 1727204047.75454: _low_level_execute_command(): starting 8975 1727204047.75462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/ /root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/AnsiballZ_systemd.py && sleep 0' 8975 1727204047.76230: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204047.76327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204047.76384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204047.76387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204047.76470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204047.78732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204047.78737: stdout chunk (state=3): >>><<< 8975 1727204047.78740: stderr chunk (state=3): >>><<< 8975 1727204047.78744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204047.78746: _low_level_execute_command(): starting 8975 1727204047.78749: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/AnsiballZ_systemd.py && sleep 0' 8975 1727204047.80479: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204047.80484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204047.80494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204047.80499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204047.80502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204047.80505: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204047.80507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204047.80509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204047.80571: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204047.80575: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8975 1727204047.80578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204047.80580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204047.80583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204047.80585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204047.80587: stderr chunk (state=3): >>>debug2: match found <<< 8975 1727204047.80589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204047.80882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204047.80892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204047.81080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204048.12954: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": 
"terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4022272", "MemoryPeak": "4558848", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3494367232", "CPUUsageNSec": "390239000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": 
"infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 8975 1727204048.13012: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", 
"SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": <<< 8975 1727204048.13016: stdout chunk (state=3): >>>"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8975 1727204048.14877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204048.14940: stderr chunk (state=3): >>><<< 8975 1727204048.14944: stdout chunk (state=3): >>><<< 8975 1727204048.14961: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4022272", "MemoryPeak": "4558848", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3494367232", "CPUUsageNSec": "390239000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204048.15105: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204048.15122: _low_level_execute_command(): starting 8975 1727204048.15125: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204047.4054353-10833-59050539413946/ > /dev/null 2>&1 && sleep 0' 8975 1727204048.15717: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204048.15721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204048.15723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204048.15746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204048.15750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204048.15794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204048.15797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204048.15808: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204048.15902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204048.17788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204048.17853: stderr chunk (state=3): >>><<< 8975 1727204048.17856: stdout chunk (state=3): >>><<< 8975 1727204048.17873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204048.17880: handler run complete 8975 1727204048.17920: attempt loop complete, returning result 8975 1727204048.17924: _execute() done 8975 1727204048.17926: dumping result to json 8975 1727204048.17945: done dumping result, returning 8975 1727204048.17951: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-9356-306d-000000000032] 8975 1727204048.17960: sending task result for task 127b8e07-fff9-9356-306d-000000000032 8975 1727204048.18219: done sending task result for task 127b8e07-fff9-9356-306d-000000000032 8975 1727204048.18222: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204048.18281: no more pending results, returning what we have 8975 1727204048.18284: results queue empty 8975 1727204048.18285: checking for any_errors_fatal 8975 1727204048.18291: done checking for any_errors_fatal 8975 1727204048.18292: checking for max_fail_percentage 8975 1727204048.18294: done checking for max_fail_percentage 8975 1727204048.18294: checking to see if all hosts have failed and the running result is not ok 8975 1727204048.18295: done checking to see if all hosts have failed 8975 1727204048.18296: getting the remaining hosts for this loop 8975 1727204048.18298: done getting the remaining hosts for this loop 8975 1727204048.18302: getting the next task for host managed-node2 8975 1727204048.18308: done getting next task for host managed-node2 8975 1727204048.18312: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8975 1727204048.18314: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204048.18326: getting variables 8975 1727204048.18327: in VariableManager get_vars() 8975 1727204048.18372: Calling all_inventory to load vars for managed-node2 8975 1727204048.18375: Calling groups_inventory to load vars for managed-node2 8975 1727204048.18377: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204048.18387: Calling all_plugins_play to load vars for managed-node2 8975 1727204048.18390: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204048.18392: Calling groups_plugins_play to load vars for managed-node2 8975 1727204048.19617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204048.21021: done with get_vars() 8975 1727204048.21062: done getting variables 8975 1727204048.21130: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.958) 0:00:19.528 ***** 8975 1727204048.21157: entering _queue_task() for managed-node2/service 8975 1727204048.21481: worker is 1 (out of 1 available) 8975 1727204048.21496: exiting _queue_task() for managed-node2/service 8975 1727204048.21512: done queuing things up, now waiting for results queue to drain 8975 1727204048.21514: waiting for pending results... 
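For reference, the "Enable and start NetworkManager" task that just completed reported its module arguments as name=NetworkManager, state=started, enabled=true, dispatched through ansible.legacy.systemd after the 'service' action plugin was loaded, with the result censored by no_log. A minimal sketch of a task that would produce this invocation is given below; the role's actual task file is not reproduced in this log, so the exact wording and options are assumptions.

    - name: Enable and start NetworkManager
      # Hypothetical reconstruction from the logged module_args; the real task
      # in roles/network/tasks/main.yml is not shown in this transcript.
      service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true   # consistent with the "censored" result reported above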
8975 1727204048.22008: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8975 1727204048.22015: in run() - task 127b8e07-fff9-9356-306d-000000000033 8975 1727204048.22019: variable 'ansible_search_path' from source: unknown 8975 1727204048.22021: variable 'ansible_search_path' from source: unknown 8975 1727204048.22070: calling self._execute() 8975 1727204048.22176: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204048.22206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204048.22224: variable 'omit' from source: magic vars 8975 1727204048.22793: variable 'ansible_distribution_major_version' from source: facts 8975 1727204048.22820: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204048.23000: variable 'network_provider' from source: set_fact 8975 1727204048.23030: Evaluated conditional (network_provider == "nm"): True 8975 1727204048.23188: variable '__network_wpa_supplicant_required' from source: role '' defaults 8975 1727204048.23300: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8975 1727204048.23527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204048.25673: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204048.25678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204048.25715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204048.25759: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204048.25799: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204048.25910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204048.25950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204048.25988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204048.26034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204048.26055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204048.26111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204048.26140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 8975 1727204048.26169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204048.26273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204048.26277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204048.26280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204048.26300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204048.26333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204048.26384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204048.26406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204048.26591: variable 'network_connections' from source: task vars 8975 1727204048.26608: variable 'controller_profile' from source: play vars 8975 1727204048.26689: variable 'controller_profile' from source: play vars 8975 1727204048.26706: variable 'controller_device' from source: play vars 8975 1727204048.26870: variable 'controller_device' from source: play vars 8975 1727204048.26873: variable 'port1_profile' from source: play vars 8975 1727204048.26875: variable 'port1_profile' from source: play vars 8975 1727204048.26877: variable 'dhcp_interface1' from source: play vars 8975 1727204048.26922: variable 'dhcp_interface1' from source: play vars 8975 1727204048.26935: variable 'controller_profile' from source: play vars 8975 1727204048.27000: variable 'controller_profile' from source: play vars 8975 1727204048.27013: variable 'port2_profile' from source: play vars 8975 1727204048.27080: variable 'port2_profile' from source: play vars 8975 1727204048.27093: variable 'dhcp_interface2' from source: play vars 8975 1727204048.27158: variable 'dhcp_interface2' from source: play vars 8975 1727204048.27172: variable 'controller_profile' from source: play vars 8975 1727204048.27232: variable 'controller_profile' from source: play vars 8975 1727204048.27319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204048.27515: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204048.27560: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204048.27600: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204048.27635: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204048.27694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204048.27870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204048.27875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204048.27885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204048.27888: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204048.28141: variable 'network_connections' from source: task vars 8975 1727204048.28152: variable 'controller_profile' from source: play vars 8975 1727204048.28223: variable 'controller_profile' from source: play vars 8975 1727204048.28237: variable 'controller_device' from source: play vars 8975 1727204048.28572: variable 'controller_device' from source: play vars 8975 1727204048.28575: variable 'port1_profile' from source: play vars 8975 1727204048.28577: variable 'port1_profile' from source: play vars 8975 1727204048.28675: variable 'dhcp_interface1' from source: play vars 8975 1727204048.28679: variable 'dhcp_interface1' from source: play vars 8975 1727204048.28681: variable 'controller_profile' from source: play vars 8975 1727204048.28809: variable 'controller_profile' from source: play vars 8975 1727204048.28821: variable 'port2_profile' from source: play vars 8975 1727204048.28880: variable 'port2_profile' from source: play vars 8975 1727204048.28981: variable 'dhcp_interface2' from source: play vars 8975 1727204048.29039: variable 'dhcp_interface2' from source: play vars 8975 1727204048.29470: variable 'controller_profile' from source: play vars 8975 1727204048.29473: variable 'controller_profile' from source: play vars 8975 1727204048.29476: Evaluated conditional (__network_wpa_supplicant_required): False 8975 1727204048.29479: when evaluation is False, skipping this task 8975 1727204048.29481: _execute() done 8975 1727204048.29483: dumping result to json 8975 1727204048.29485: done dumping result, returning 8975 1727204048.29491: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-9356-306d-000000000033] 8975 1727204048.29493: sending task result for task 127b8e07-fff9-9356-306d-000000000033 8975 1727204048.29578: done sending task result for task 127b8e07-fff9-9356-306d-000000000033 8975 1727204048.29582: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 8975 1727204048.29636: no more pending results, returning what we have 8975 1727204048.29639: results queue empty 8975 1727204048.29640: checking for any_errors_fatal 8975 1727204048.29664: done checking for any_errors_fatal 8975 1727204048.29664: checking for max_fail_percentage 
8975 1727204048.29675: done checking for max_fail_percentage 8975 1727204048.29683: checking to see if all hosts have failed and the running result is not ok 8975 1727204048.29685: done checking to see if all hosts have failed 8975 1727204048.29685: getting the remaining hosts for this loop 8975 1727204048.29687: done getting the remaining hosts for this loop 8975 1727204048.29692: getting the next task for host managed-node2 8975 1727204048.29701: done getting next task for host managed-node2 8975 1727204048.29705: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 8975 1727204048.29708: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204048.29723: getting variables 8975 1727204048.29724: in VariableManager get_vars() 8975 1727204048.29859: Calling all_inventory to load vars for managed-node2 8975 1727204048.29863: Calling groups_inventory to load vars for managed-node2 8975 1727204048.29868: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204048.29880: Calling all_plugins_play to load vars for managed-node2 8975 1727204048.29884: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204048.29980: Calling groups_plugins_play to load vars for managed-node2 8975 1727204048.32107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204048.36223: done with get_vars() 8975 1727204048.36262: done getting variables 8975 1727204048.36534: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.154) 0:00:19.684 ***** 8975 1727204048.36777: entering _queue_task() for managed-node2/service 8975 1727204048.37356: worker is 1 (out of 1 available) 8975 1727204048.37375: exiting _queue_task() for managed-node2/service 8975 1727204048.37390: done queuing things up, now waiting for results queue to drain 8975 1727204048.37392: waiting for pending results... 
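The "Evaluated conditional" entries above show the when-clauses on the wpa_supplicant task being tested in sequence: the distribution check and network_provider == "nm" both passed, and __network_wpa_supplicant_required evaluated to False, so the task was skipped before any module arguments were built. A task guarded the same way might look roughly like the sketch below; the body is hypothetical, since the skipped task's arguments never appear in this log.

    - name: Enable and start wpa_supplicant
      # Hypothetical body; only the when-conditions are taken from the log.
      service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required

This matches the skip result above, whose false_condition is reported as "__network_wpa_supplicant_required".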
8975 1727204048.38091: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 8975 1727204048.38391: in run() - task 127b8e07-fff9-9356-306d-000000000034 8975 1727204048.38480: variable 'ansible_search_path' from source: unknown 8975 1727204048.38512: variable 'ansible_search_path' from source: unknown 8975 1727204048.38612: calling self._execute() 8975 1727204048.38749: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204048.38756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204048.38777: variable 'omit' from source: magic vars 8975 1727204048.39224: variable 'ansible_distribution_major_version' from source: facts 8975 1727204048.39238: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204048.39456: variable 'network_provider' from source: set_fact 8975 1727204048.39459: Evaluated conditional (network_provider == "initscripts"): False 8975 1727204048.39462: when evaluation is False, skipping this task 8975 1727204048.39464: _execute() done 8975 1727204048.39468: dumping result to json 8975 1727204048.39470: done dumping result, returning 8975 1727204048.39472: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-9356-306d-000000000034] 8975 1727204048.39474: sending task result for task 127b8e07-fff9-9356-306d-000000000034 8975 1727204048.39548: done sending task result for task 127b8e07-fff9-9356-306d-000000000034 8975 1727204048.39552: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204048.39602: no more pending results, returning what we have 8975 1727204048.39606: results queue empty 8975 1727204048.39607: checking for any_errors_fatal 8975 1727204048.39620: done checking for any_errors_fatal 8975 1727204048.39621: checking for max_fail_percentage 8975 1727204048.39623: done checking for max_fail_percentage 8975 1727204048.39624: checking to see if all hosts have failed and the running result is not ok 8975 1727204048.39625: done checking to see if all hosts have failed 8975 1727204048.39626: getting the remaining hosts for this loop 8975 1727204048.39628: done getting the remaining hosts for this loop 8975 1727204048.39632: getting the next task for host managed-node2 8975 1727204048.39642: done getting next task for host managed-node2 8975 1727204048.39646: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8975 1727204048.39649: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204048.39669: getting variables 8975 1727204048.39671: in VariableManager get_vars() 8975 1727204048.39720: Calling all_inventory to load vars for managed-node2 8975 1727204048.39723: Calling groups_inventory to load vars for managed-node2 8975 1727204048.39725: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204048.39741: Calling all_plugins_play to load vars for managed-node2 8975 1727204048.39744: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204048.39747: Calling groups_plugins_play to load vars for managed-node2 8975 1727204048.42359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204048.44923: done with get_vars() 8975 1727204048.44970: done getting variables 8975 1727204048.45042: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.083) 0:00:19.767 ***** 8975 1727204048.45085: entering _queue_task() for managed-node2/copy 8975 1727204048.45683: worker is 1 (out of 1 available) 8975 1727204048.45698: exiting _queue_task() for managed-node2/copy 8975 1727204048.45713: done queuing things up, now waiting for results queue to drain 8975 1727204048.45715: waiting for pending results... 8975 1727204048.46471: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8975 1727204048.46653: in run() - task 127b8e07-fff9-9356-306d-000000000035 8975 1727204048.46682: variable 'ansible_search_path' from source: unknown 8975 1727204048.46716: variable 'ansible_search_path' from source: unknown 8975 1727204048.46770: calling self._execute() 8975 1727204048.47064: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204048.47081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204048.47106: variable 'omit' from source: magic vars 8975 1727204048.47630: variable 'ansible_distribution_major_version' from source: facts 8975 1727204048.47652: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204048.47802: variable 'network_provider' from source: set_fact 8975 1727204048.47814: Evaluated conditional (network_provider == "initscripts"): False 8975 1727204048.47837: when evaluation is False, skipping this task 8975 1727204048.47840: _execute() done 8975 1727204048.47845: dumping result to json 8975 1727204048.47871: done dumping result, returning 8975 1727204048.47876: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-9356-306d-000000000035] 8975 1727204048.47880: sending task result for task 127b8e07-fff9-9356-306d-000000000035 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8975 1727204048.48200: no more pending results, returning what we have 8975 1727204048.48205: results queue empty 8975 
1727204048.48206: checking for any_errors_fatal 8975 1727204048.48212: done checking for any_errors_fatal 8975 1727204048.48213: checking for max_fail_percentage 8975 1727204048.48215: done checking for max_fail_percentage 8975 1727204048.48216: checking to see if all hosts have failed and the running result is not ok 8975 1727204048.48217: done checking to see if all hosts have failed 8975 1727204048.48218: getting the remaining hosts for this loop 8975 1727204048.48220: done getting the remaining hosts for this loop 8975 1727204048.48224: getting the next task for host managed-node2 8975 1727204048.48279: done getting next task for host managed-node2 8975 1727204048.48284: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8975 1727204048.48289: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204048.48308: getting variables 8975 1727204048.48310: in VariableManager get_vars() 8975 1727204048.48471: Calling all_inventory to load vars for managed-node2 8975 1727204048.48475: Calling groups_inventory to load vars for managed-node2 8975 1727204048.48477: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204048.48485: done sending task result for task 127b8e07-fff9-9356-306d-000000000035 8975 1727204048.48490: WORKER PROCESS EXITING 8975 1727204048.48505: Calling all_plugins_play to load vars for managed-node2 8975 1727204048.48509: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204048.48513: Calling groups_plugins_play to load vars for managed-node2 8975 1727204048.51565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204048.53919: done with get_vars() 8975 1727204048.53962: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.089) 0:00:19.857 ***** 8975 1727204048.54075: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 8975 1727204048.54077: Creating lock for fedora.linux_system_roles.network_connections 8975 1727204048.54484: worker is 1 (out of 1 available) 8975 1727204048.54502: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 8975 1727204048.54517: done queuing things up, now waiting for results queue to drain 8975 1727204048.54518: waiting for pending results... 
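Both initscripts-specific tasks above are skipped for the same reason: "Enable network service" (main.yml:142, whose skip result is shown as censored because no_log: true is set on it) and "Ensure initscripts network file dependency is present" (main.yml:150) are each guarded by network_provider == "initscripts", and network_provider was resolved via set_fact to nm in this run, as the later module invocation confirms. The guard pattern is roughly the following; the task body is illustrative, only the conditional and the provider value are taken from the log.

    # Sketch only: initscripts-specific work is bypassed when the provider is nm.
    - name: Enable network service
      ansible.builtin.service:
        name: network               # assumed legacy initscripts service name
        enabled: true
      when: network_provider == "initscripts"   # False here, provider is nm

    # Pinning the provider explicitly instead of relying on detection (illustrative):
    # vars:
    #   network_provider: nm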
8975 1727204048.54886: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8975 1727204048.55057: in run() - task 127b8e07-fff9-9356-306d-000000000036 8975 1727204048.55107: variable 'ansible_search_path' from source: unknown 8975 1727204048.55112: variable 'ansible_search_path' from source: unknown 8975 1727204048.55166: calling self._execute() 8975 1727204048.55258: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204048.55274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204048.55573: variable 'omit' from source: magic vars 8975 1727204048.55883: variable 'ansible_distribution_major_version' from source: facts 8975 1727204048.55901: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204048.55904: variable 'omit' from source: magic vars 8975 1727204048.56012: variable 'omit' from source: magic vars 8975 1727204048.56150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204048.58914: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204048.59007: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204048.59061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204048.59111: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204048.59146: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204048.59272: variable 'network_provider' from source: set_fact 8975 1727204048.59414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204048.59462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204048.59571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204048.59579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204048.59582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204048.59641: variable 'omit' from source: magic vars 8975 1727204048.59903: variable 'omit' from source: magic vars 8975 1727204048.59906: variable 'network_connections' from source: task vars 8975 1727204048.59913: variable 'controller_profile' from source: play vars 8975 1727204048.59989: variable 'controller_profile' from source: play vars 8975 1727204048.59997: variable 'controller_device' from source: play vars 8975 1727204048.60070: variable 'controller_device' from source: play vars 8975 1727204048.60079: variable 'port1_profile' from source: play vars 8975 
1727204048.60139: variable 'port1_profile' from source: play vars 8975 1727204048.60147: variable 'dhcp_interface1' from source: play vars 8975 1727204048.60216: variable 'dhcp_interface1' from source: play vars 8975 1727204048.60225: variable 'controller_profile' from source: play vars 8975 1727204048.60292: variable 'controller_profile' from source: play vars 8975 1727204048.60299: variable 'port2_profile' from source: play vars 8975 1727204048.60359: variable 'port2_profile' from source: play vars 8975 1727204048.60367: variable 'dhcp_interface2' from source: play vars 8975 1727204048.60434: variable 'dhcp_interface2' from source: play vars 8975 1727204048.60444: variable 'controller_profile' from source: play vars 8975 1727204048.60511: variable 'controller_profile' from source: play vars 8975 1727204048.60725: variable 'omit' from source: magic vars 8975 1727204048.60734: variable '__lsr_ansible_managed' from source: task vars 8975 1727204048.60797: variable '__lsr_ansible_managed' from source: task vars 8975 1727204048.61018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8975 1727204048.61319: Loaded config def from plugin (lookup/template) 8975 1727204048.61323: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 8975 1727204048.61326: File lookup term: get_ansible_managed.j2 8975 1727204048.61331: variable 'ansible_search_path' from source: unknown 8975 1727204048.61334: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 8975 1727204048.61338: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 8975 1727204048.61342: variable 'ansible_search_path' from source: unknown 8975 1727204048.69124: variable 'ansible_managed' from source: unknown 8975 1727204048.69318: variable 'omit' from source: magic vars 8975 1727204048.69471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204048.69476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204048.69479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204048.69482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204048.69484: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204048.69500: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204048.69509: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204048.69518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204048.69643: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204048.69651: Set connection var ansible_connection to ssh 8975 1727204048.69661: Set connection var ansible_shell_executable to /bin/sh 8975 1727204048.69673: Set connection var ansible_timeout to 10 8975 1727204048.69694: Set connection var ansible_shell_type to sh 8975 1727204048.69713: Set connection var ansible_pipelining to False 8975 1727204048.69747: variable 'ansible_shell_executable' from source: unknown 8975 1727204048.69755: variable 'ansible_connection' from source: unknown 8975 1727204048.69764: variable 'ansible_module_compression' from source: unknown 8975 1727204048.69774: variable 'ansible_shell_type' from source: unknown 8975 1727204048.69781: variable 'ansible_shell_executable' from source: unknown 8975 1727204048.69786: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204048.69792: variable 'ansible_pipelining' from source: unknown 8975 1727204048.69804: variable 'ansible_timeout' from source: unknown 8975 1727204048.69810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204048.70018: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204048.70023: variable 'omit' from source: magic vars 8975 1727204048.70025: starting attempt loop 8975 1727204048.70030: running the handler 8975 1727204048.70033: _low_level_execute_command(): starting 8975 1727204048.70035: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204048.70804: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204048.70826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204048.70907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204048.70961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204048.70980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204048.71013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 
1727204048.71121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204048.72896: stdout chunk (state=3): >>>/root <<< 8975 1727204048.73125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204048.73133: stdout chunk (state=3): >>><<< 8975 1727204048.73136: stderr chunk (state=3): >>><<< 8975 1727204048.73269: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204048.73273: _low_level_execute_command(): starting 8975 1727204048.73276: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039 `" && echo ansible-tmp-1727204048.7316136-10872-149564701403039="` echo /root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039 `" ) && sleep 0' 8975 1727204048.73982: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204048.74013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204048.74131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204048.76098: stdout chunk (state=3): >>>ansible-tmp-1727204048.7316136-10872-149564701403039=/root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039 <<< 8975 1727204048.76216: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204048.76288: stderr chunk (state=3): >>><<< 8975 1727204048.76291: stdout chunk (state=3): >>><<< 8975 1727204048.76308: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204048.7316136-10872-149564701403039=/root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204048.76357: variable 'ansible_module_compression' from source: unknown 8975 1727204048.76401: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 8975 1727204048.76405: ANSIBALLZ: Acquiring lock 8975 1727204048.76408: ANSIBALLZ: Lock acquired: 140501804255904 8975 1727204048.76411: ANSIBALLZ: Creating module 8975 1727204049.07968: ANSIBALLZ: Writing module into payload 8975 1727204049.08264: ANSIBALLZ: Writing module 8975 1727204049.08292: ANSIBALLZ: Renaming module 8975 1727204049.08298: ANSIBALLZ: Done creating module 8975 1727204049.08332: variable 'ansible_facts' from source: unknown 8975 1727204049.08533: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/AnsiballZ_network_connections.py 8975 1727204049.08686: Sending initial data 8975 1727204049.08689: Sent initial data (167 bytes) 8975 1727204049.09147: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204049.09151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204049.09153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204049.09156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 8975 1727204049.09212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204049.09216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204049.09298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204049.11037: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204049.11087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204049.11156: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp94ljljwe /root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/AnsiballZ_network_connections.py <<< 8975 1727204049.11160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/AnsiballZ_network_connections.py" <<< 8975 1727204049.11255: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp94ljljwe" to remote "/root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/AnsiballZ_network_connections.py" <<< 8975 1727204049.12375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204049.12473: stderr chunk (state=3): >>><<< 8975 1727204049.12479: stdout chunk (state=3): >>><<< 8975 1727204049.12505: done transferring module to remote 8975 1727204049.12515: _low_level_execute_command(): starting 8975 1727204049.12520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/ /root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/AnsiballZ_network_connections.py && sleep 0' 8975 1727204049.13015: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204049.13018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204049.13021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 8975 1727204049.13023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204049.13086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204049.13089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204049.13093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204049.13170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204049.15012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204049.15069: stderr chunk (state=3): >>><<< 8975 1727204049.15073: stdout chunk (state=3): >>><<< 8975 1727204049.15090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204049.15093: _low_level_execute_command(): starting 8975 1727204049.15099: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/AnsiballZ_network_connections.py && sleep 0' 8975 1727204049.15604: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204049.15615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204049.15617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204049.15622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204049.15624: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204049.15674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204049.15678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204049.15691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204049.15763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204049.67574: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 717cc466-aa3c-4897-acd8-59beced800de\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 717cc466-aa3c-4897-acd8-59beced800de (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 8975 1727204049.67612: stdout chunk (state=3): >>> <<< 8975 1727204049.69973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204049.69978: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204049.69980: stdout chunk (state=3): >>><<< 8975 1727204049.69983: stderr chunk (state=3): >>><<< 8975 1727204049.69985: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 717cc466-aa3c-4897-acd8-59beced800de\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 717cc466-aa3c-4897-acd8-59beced800de (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
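The module_args captured in the result above map directly onto the play variables resolved earlier (controller_profile, controller_device, port1_profile/port2_profile, dhcp_interface1/dhcp_interface2); the __header value was produced by the get_ansible_managed.j2 template lookup seen before the connection run. Reconstructed from this invocation, the effective network_connections input looks approximately like the block below. The literal values are read from the module invocation in the log; the comments mapping them back to play-variable names are inferred from the variable-resolution order, which the log does not print explicitly.

    network_connections:
      - name: bond0                       # controller_profile
        state: up
        type: bond
        interface_name: deprecated-bond   # controller_device
        bond:
          mode: active-backup
          miimon: 110
        ip:
          route_metric4: 65535
      - name: bond0.0                     # port1_profile
        state: up
        type: ethernet
        interface_name: test1             # dhcp_interface1
        master: bond0                     # attached to the controller profile
      - name: bond0.1                     # port2_profile
        state: up
        type: ethernet
        interface_name: test2             # dhcp_interface2
        master: bond0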
8975 1727204049.70017: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'deprecated-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'master': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'master': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204049.70034: _low_level_execute_command(): starting 8975 1727204049.70040: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204048.7316136-10872-149564701403039/ > /dev/null 2>&1 && sleep 0' 8975 1727204049.70698: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204049.70707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204049.70734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204049.70748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204049.70761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204049.70770: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204049.70783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204049.70798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204049.70805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204049.70813: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8975 1727204049.70878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204049.70885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204049.70920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204049.70953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204049.70967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204049.71070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204049.73393: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204049.73555: stderr chunk (state=3): >>><<< 8975 1727204049.73559: stdout chunk (state=3): >>><<< 8975 1727204049.73579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204049.73586: handler run complete 8975 1727204049.73871: attempt loop complete, returning result 8975 1727204049.73875: _execute() done 8975 1727204049.73878: dumping result to json 8975 1727204049.73880: done dumping result, returning 8975 1727204049.73882: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-9356-306d-000000000036] 8975 1727204049.73884: sending task result for task 127b8e07-fff9-9356-306d-000000000036 changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 717cc466-aa3c-4897-acd8-59beced800de [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 717cc466-aa3c-4897-acd8-59beced800de (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14 (not-active) 8975 1727204049.74501: no more pending results, returning what we have 8975 1727204049.74507: results queue empty 8975 1727204049.74508: checking for any_errors_fatal 8975 1727204049.74516: done checking for any_errors_fatal 8975 
1727204049.74517: checking for max_fail_percentage 8975 1727204049.74519: done checking for max_fail_percentage 8975 1727204049.74520: checking to see if all hosts have failed and the running result is not ok 8975 1727204049.74522: done checking to see if all hosts have failed 8975 1727204049.74522: getting the remaining hosts for this loop 8975 1727204049.74524: done getting the remaining hosts for this loop 8975 1727204049.74529: getting the next task for host managed-node2 8975 1727204049.74538: done getting next task for host managed-node2 8975 1727204049.74542: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 8975 1727204049.74545: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204049.74558: getting variables 8975 1727204049.74560: in VariableManager get_vars() 8975 1727204049.74791: Calling all_inventory to load vars for managed-node2 8975 1727204049.74795: Calling groups_inventory to load vars for managed-node2 8975 1727204049.74797: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204049.74806: done sending task result for task 127b8e07-fff9-9356-306d-000000000036 8975 1727204049.74827: WORKER PROCESS EXITING 8975 1727204049.74839: Calling all_plugins_play to load vars for managed-node2 8975 1727204049.74842: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204049.74846: Calling groups_plugins_play to load vars for managed-node2 8975 1727204049.78849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204049.81363: done with get_vars() 8975 1727204049.81394: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:09 -0400 (0:00:01.273) 0:00:21.131 ***** 8975 1727204049.81471: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 8975 1727204049.81473: Creating lock for fedora.linux_system_roles.network_state 8975 1727204049.81769: worker is 1 (out of 1 available) 8975 1727204049.81785: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 8975 1727204049.81799: done queuing things up, now waiting for results queue to drain 8975 1727204049.81800: waiting for pending results... 
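The changed result above shows three profiles created and activated in one pass: bond0 is added and then reactivated because its settings were changed (is-modified), while the two port profiles are added and brought up from a not-active state. One way to double-check the outcome on the managed node, outside of this test playbook, is an ad-hoc verification play along the lines of the sketch below; the nmcli field list and the /proc/net/bonding path are standard NetworkManager and kernel bonding interfaces, not anything taken from this log.

    # Sketch only: quick post-run verification of the bond created by the role.
    - name: Verify bond profiles created by the network role (illustrative)
      hosts: managed-node2
      gather_facts: false
      tasks:
        - name: List NetworkManager connection profiles
          ansible.builtin.command: nmcli -t -f NAME,TYPE,DEVICE connection show
          register: nm_conns
          changed_when: false

        - name: Show kernel bonding state for the controller interface
          ansible.builtin.command: cat /proc/net/bonding/deprecated-bond
          register: bond_state
          changed_when: false

        - name: Print the connection list
          ansible.builtin.debug:
            var: nm_conns.stdout_lines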
8975 1727204049.81983: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 8975 1727204049.82088: in run() - task 127b8e07-fff9-9356-306d-000000000037 8975 1727204049.82101: variable 'ansible_search_path' from source: unknown 8975 1727204049.82105: variable 'ansible_search_path' from source: unknown 8975 1727204049.82142: calling self._execute() 8975 1727204049.82218: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204049.82224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204049.82234: variable 'omit' from source: magic vars 8975 1727204049.82572: variable 'ansible_distribution_major_version' from source: facts 8975 1727204049.82578: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204049.82716: variable 'network_state' from source: role '' defaults 8975 1727204049.82720: Evaluated conditional (network_state != {}): False 8975 1727204049.82723: when evaluation is False, skipping this task 8975 1727204049.82726: _execute() done 8975 1727204049.82732: dumping result to json 8975 1727204049.82734: done dumping result, returning 8975 1727204049.82738: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-9356-306d-000000000037] 8975 1727204049.82744: sending task result for task 127b8e07-fff9-9356-306d-000000000037 8975 1727204049.82937: done sending task result for task 127b8e07-fff9-9356-306d-000000000037 8975 1727204049.82940: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204049.83025: no more pending results, returning what we have 8975 1727204049.83031: results queue empty 8975 1727204049.83032: checking for any_errors_fatal 8975 1727204049.83042: done checking for any_errors_fatal 8975 1727204049.83043: checking for max_fail_percentage 8975 1727204049.83045: done checking for max_fail_percentage 8975 1727204049.83045: checking to see if all hosts have failed and the running result is not ok 8975 1727204049.83047: done checking to see if all hosts have failed 8975 1727204049.83047: getting the remaining hosts for this loop 8975 1727204049.83049: done getting the remaining hosts for this loop 8975 1727204049.83052: getting the next task for host managed-node2 8975 1727204049.83059: done getting next task for host managed-node2 8975 1727204049.83063: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8975 1727204049.83068: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204049.83087: getting variables 8975 1727204049.83088: in VariableManager get_vars() 8975 1727204049.83134: Calling all_inventory to load vars for managed-node2 8975 1727204049.83153: Calling groups_inventory to load vars for managed-node2 8975 1727204049.83157: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204049.83170: Calling all_plugins_play to load vars for managed-node2 8975 1727204049.83173: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204049.83176: Calling groups_plugins_play to load vars for managed-node2 8975 1727204049.85637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204049.86855: done with get_vars() 8975 1727204049.86888: done getting variables 8975 1727204049.86954: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.055) 0:00:21.186 ***** 8975 1727204049.86999: entering _queue_task() for managed-node2/debug 8975 1727204049.87601: worker is 1 (out of 1 available) 8975 1727204049.87613: exiting _queue_task() for managed-node2/debug 8975 1727204049.87623: done queuing things up, now waiting for results queue to drain 8975 1727204049.87624: waiting for pending results... 8975 1727204049.87950: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8975 1727204049.87961: in run() - task 127b8e07-fff9-9356-306d-000000000038 8975 1727204049.87979: variable 'ansible_search_path' from source: unknown 8975 1727204049.87987: variable 'ansible_search_path' from source: unknown 8975 1727204049.88035: calling self._execute() 8975 1727204049.88141: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204049.88157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204049.88196: variable 'omit' from source: magic vars 8975 1727204049.89196: variable 'ansible_distribution_major_version' from source: facts 8975 1727204049.89202: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204049.89218: variable 'omit' from source: magic vars 8975 1727204049.89473: variable 'omit' from source: magic vars 8975 1727204049.89477: variable 'omit' from source: magic vars 8975 1727204049.89599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204049.89659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204049.89696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204049.89722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204049.89745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204049.89789: variable 'inventory_hostname' from source: host vars 
for 'managed-node2' 8975 1727204049.89798: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204049.89805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204049.89921: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204049.89929: Set connection var ansible_connection to ssh 8975 1727204049.89941: Set connection var ansible_shell_executable to /bin/sh 8975 1727204049.89955: Set connection var ansible_timeout to 10 8975 1727204049.90057: Set connection var ansible_shell_type to sh 8975 1727204049.90060: Set connection var ansible_pipelining to False 8975 1727204049.90062: variable 'ansible_shell_executable' from source: unknown 8975 1727204049.90064: variable 'ansible_connection' from source: unknown 8975 1727204049.90069: variable 'ansible_module_compression' from source: unknown 8975 1727204049.90071: variable 'ansible_shell_type' from source: unknown 8975 1727204049.90073: variable 'ansible_shell_executable' from source: unknown 8975 1727204049.90075: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204049.90076: variable 'ansible_pipelining' from source: unknown 8975 1727204049.90078: variable 'ansible_timeout' from source: unknown 8975 1727204049.90080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204049.90210: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204049.90229: variable 'omit' from source: magic vars 8975 1727204049.90239: starting attempt loop 8975 1727204049.90246: running the handler 8975 1727204049.90411: variable '__network_connections_result' from source: set_fact 8975 1727204049.90483: handler run complete 8975 1727204049.90510: attempt loop complete, returning result 8975 1727204049.90517: _execute() done 8975 1727204049.90524: dumping result to json 8975 1727204049.90531: done dumping result, returning 8975 1727204049.90544: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-9356-306d-000000000038] 8975 1727204049.90554: sending task result for task 127b8e07-fff9-9356-306d-000000000038 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 717cc466-aa3c-4897-acd8-59beced800de", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 717cc466-aa3c-4897-acd8-59beced800de (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14 (not-active)" ] } 8975 1727204049.90808: no more pending results, returning what we have 8975 1727204049.90814: results queue empty 8975 1727204049.90815: checking for any_errors_fatal 8975 1727204049.90822: done checking for any_errors_fatal 8975 1727204049.90823: checking for 
max_fail_percentage 8975 1727204049.90825: done checking for max_fail_percentage 8975 1727204049.90826: checking to see if all hosts have failed and the running result is not ok 8975 1727204049.90827: done checking to see if all hosts have failed 8975 1727204049.90828: getting the remaining hosts for this loop 8975 1727204049.90830: done getting the remaining hosts for this loop 8975 1727204049.90834: getting the next task for host managed-node2 8975 1727204049.90842: done getting next task for host managed-node2 8975 1727204049.90847: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8975 1727204049.90850: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204049.90861: getting variables 8975 1727204049.90863: in VariableManager get_vars() 8975 1727204049.90911: Calling all_inventory to load vars for managed-node2 8975 1727204049.90914: Calling groups_inventory to load vars for managed-node2 8975 1727204049.90917: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204049.90929: Calling all_plugins_play to load vars for managed-node2 8975 1727204049.90933: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204049.90936: Calling groups_plugins_play to load vars for managed-node2 8975 1727204049.91486: done sending task result for task 127b8e07-fff9-9356-306d-000000000038 8975 1727204049.91491: WORKER PROCESS EXITING 8975 1727204049.93112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204049.96982: done with get_vars() 8975 1727204049.97029: done getting variables 8975 1727204049.97432: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.104) 0:00:21.291 ***** 8975 1727204049.97501: entering _queue_task() for managed-node2/debug 8975 1727204049.98316: worker is 1 (out of 1 available) 8975 1727204049.98332: exiting _queue_task() for managed-node2/debug 8975 1727204049.98348: done queuing things up, now waiting for results queue to drain 8975 1727204049.98349: waiting for pending results... 
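The two debug entries surrounding this point come from plain debug tasks in the role's tasks/main.yml (lines 177 and 181 of the copy installed under /tmp/collections-MVC). A minimal sketch of how such tasks are typically written is shown below; the exact wording in the shipped role may differ, this is only a reconstruction from the task names and the variables they print:

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result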
8975 1727204049.99237: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8975 1727204049.99638: in run() - task 127b8e07-fff9-9356-306d-000000000039 8975 1727204049.99643: variable 'ansible_search_path' from source: unknown 8975 1727204049.99646: variable 'ansible_search_path' from source: unknown 8975 1727204049.99975: calling self._execute() 8975 1727204050.00134: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.00139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.00142: variable 'omit' from source: magic vars 8975 1727204050.00959: variable 'ansible_distribution_major_version' from source: facts 8975 1727204050.00973: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204050.00979: variable 'omit' from source: magic vars 8975 1727204050.01473: variable 'omit' from source: magic vars 8975 1727204050.01477: variable 'omit' from source: magic vars 8975 1727204050.01567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204050.01612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204050.01637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204050.01773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204050.01788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204050.01826: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204050.01829: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.01836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.02144: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204050.02149: Set connection var ansible_connection to ssh 8975 1727204050.02151: Set connection var ansible_shell_executable to /bin/sh 8975 1727204050.02159: Set connection var ansible_timeout to 10 8975 1727204050.02162: Set connection var ansible_shell_type to sh 8975 1727204050.02177: Set connection var ansible_pipelining to False 8975 1727204050.02317: variable 'ansible_shell_executable' from source: unknown 8975 1727204050.02320: variable 'ansible_connection' from source: unknown 8975 1727204050.02324: variable 'ansible_module_compression' from source: unknown 8975 1727204050.02326: variable 'ansible_shell_type' from source: unknown 8975 1727204050.02328: variable 'ansible_shell_executable' from source: unknown 8975 1727204050.02334: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.02338: variable 'ansible_pipelining' from source: unknown 8975 1727204050.02349: variable 'ansible_timeout' from source: unknown 8975 1727204050.02354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.02630: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204050.02644: variable 'omit' from source: 
magic vars 8975 1727204050.02649: starting attempt loop 8975 1727204050.02652: running the handler 8975 1727204050.02872: variable '__network_connections_result' from source: set_fact 8975 1727204050.03125: variable '__network_connections_result' from source: set_fact 8975 1727204050.03403: handler run complete 8975 1727204050.03441: attempt loop complete, returning result 8975 1727204050.03454: _execute() done 8975 1727204050.03461: dumping result to json 8975 1727204050.03473: done dumping result, returning 8975 1727204050.03486: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-9356-306d-000000000039] 8975 1727204050.03496: sending task result for task 127b8e07-fff9-9356-306d-000000000039 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 717cc466-aa3c-4897-acd8-59beced800de\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 717cc466-aa3c-4897-acd8-59beced800de (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 717cc466-aa3c-4897-acd8-59beced800de", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 717cc466-aa3c-4897-acd8-59beced800de (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0bd07c0f-2e8d-423f-a6c9-7d5983583758 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, ffedb32f-704a-41b8-a516-608af8e03c14 (not-active)" ] } } 8975 1727204050.03855: no more pending results, returning what we have 8975 1727204050.03859: results queue empty 8975 1727204050.03860: checking for any_errors_fatal 8975 1727204050.03869: done checking for any_errors_fatal 8975 1727204050.03870: checking for max_fail_percentage 8975 1727204050.03880: done checking for max_fail_percentage 8975 1727204050.03881: checking to see if all hosts have failed and the running result is not ok 8975 1727204050.03882: done checking to see if all hosts have failed 8975 1727204050.03883: getting 
the remaining hosts for this loop 8975 1727204050.03885: done getting the remaining hosts for this loop 8975 1727204050.03890: getting the next task for host managed-node2 8975 1727204050.03898: done getting next task for host managed-node2 8975 1727204050.03902: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8975 1727204050.03905: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204050.03918: getting variables 8975 1727204050.03920: in VariableManager get_vars() 8975 1727204050.04171: Calling all_inventory to load vars for managed-node2 8975 1727204050.04174: Calling groups_inventory to load vars for managed-node2 8975 1727204050.04177: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204050.04188: Calling all_plugins_play to load vars for managed-node2 8975 1727204050.04191: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204050.04194: Calling groups_plugins_play to load vars for managed-node2 8975 1727204050.04896: done sending task result for task 127b8e07-fff9-9356-306d-000000000039 8975 1727204050.04900: WORKER PROCESS EXITING 8975 1727204050.06456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204050.09905: done with get_vars() 8975 1727204050.09950: done getting variables 8975 1727204050.10015: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.125) 0:00:21.417 ***** 8975 1727204050.10052: entering _queue_task() for managed-node2/debug 8975 1727204050.10524: worker is 1 (out of 1 available) 8975 1727204050.10543: exiting _queue_task() for managed-node2/debug 8975 1727204050.10558: done queuing things up, now waiting for results queue to drain 8975 1727204050.10559: waiting for pending results... 
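The module_args block printed in the previous result makes it possible to reconstruct the connection profiles the role was asked to apply in this run. Below is a sketch of the equivalent network_connections role variable, rebuilt from that output; the test playbook itself very likely templates the bond and port names through play variables such as controller_device (seen later in this log), so treat this as illustrative rather than the literal play source:

network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: deprecated-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    master: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    master: bond0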
8975 1727204050.10959: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8975 1727204050.11201: in run() - task 127b8e07-fff9-9356-306d-00000000003a 8975 1727204050.11271: variable 'ansible_search_path' from source: unknown 8975 1727204050.11280: variable 'ansible_search_path' from source: unknown 8975 1727204050.11335: calling self._execute() 8975 1727204050.11599: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.11611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.11631: variable 'omit' from source: magic vars 8975 1727204050.12574: variable 'ansible_distribution_major_version' from source: facts 8975 1727204050.12578: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204050.12705: variable 'network_state' from source: role '' defaults 8975 1727204050.12725: Evaluated conditional (network_state != {}): False 8975 1727204050.12736: when evaluation is False, skipping this task 8975 1727204050.12765: _execute() done 8975 1727204050.12809: dumping result to json 8975 1727204050.12818: done dumping result, returning 8975 1727204050.12836: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-9356-306d-00000000003a] 8975 1727204050.12850: sending task result for task 127b8e07-fff9-9356-306d-00000000003a 8975 1727204050.13273: done sending task result for task 127b8e07-fff9-9356-306d-00000000003a 8975 1727204050.13277: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 8975 1727204050.13335: no more pending results, returning what we have 8975 1727204050.13339: results queue empty 8975 1727204050.13340: checking for any_errors_fatal 8975 1727204050.13349: done checking for any_errors_fatal 8975 1727204050.13350: checking for max_fail_percentage 8975 1727204050.13353: done checking for max_fail_percentage 8975 1727204050.13354: checking to see if all hosts have failed and the running result is not ok 8975 1727204050.13355: done checking to see if all hosts have failed 8975 1727204050.13356: getting the remaining hosts for this loop 8975 1727204050.13358: done getting the remaining hosts for this loop 8975 1727204050.13363: getting the next task for host managed-node2 8975 1727204050.13373: done getting next task for host managed-node2 8975 1727204050.13378: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 8975 1727204050.13382: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204050.13401: getting variables 8975 1727204050.13403: in VariableManager get_vars() 8975 1727204050.13459: Calling all_inventory to load vars for managed-node2 8975 1727204050.13462: Calling groups_inventory to load vars for managed-node2 8975 1727204050.13464: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204050.13880: Calling all_plugins_play to load vars for managed-node2 8975 1727204050.13884: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204050.13887: Calling groups_plugins_play to load vars for managed-node2 8975 1727204050.16845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204050.19037: done with get_vars() 8975 1727204050.19079: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.091) 0:00:21.508 ***** 8975 1727204050.19199: entering _queue_task() for managed-node2/ping 8975 1727204050.19201: Creating lock for ping 8975 1727204050.19782: worker is 1 (out of 1 available) 8975 1727204050.19793: exiting _queue_task() for managed-node2/ping 8975 1727204050.19805: done queuing things up, now waiting for results queue to drain 8975 1727204050.19807: waiting for pending results... 8975 1727204050.19960: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 8975 1727204050.20122: in run() - task 127b8e07-fff9-9356-306d-00000000003b 8975 1727204050.20161: variable 'ansible_search_path' from source: unknown 8975 1727204050.20176: variable 'ansible_search_path' from source: unknown 8975 1727204050.20218: calling self._execute() 8975 1727204050.20325: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.20340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.20470: variable 'omit' from source: magic vars 8975 1727204050.20804: variable 'ansible_distribution_major_version' from source: facts 8975 1727204050.20826: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204050.20842: variable 'omit' from source: magic vars 8975 1727204050.20921: variable 'omit' from source: magic vars 8975 1727204050.20970: variable 'omit' from source: magic vars 8975 1727204050.21030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204050.21079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204050.21108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204050.21141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204050.21160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204050.21201: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204050.21211: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.21220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.21353: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204050.21362: Set 
connection var ansible_connection to ssh 8975 1727204050.21375: Set connection var ansible_shell_executable to /bin/sh 8975 1727204050.21386: Set connection var ansible_timeout to 10 8975 1727204050.21393: Set connection var ansible_shell_type to sh 8975 1727204050.21419: Set connection var ansible_pipelining to False 8975 1727204050.21561: variable 'ansible_shell_executable' from source: unknown 8975 1727204050.21565: variable 'ansible_connection' from source: unknown 8975 1727204050.21571: variable 'ansible_module_compression' from source: unknown 8975 1727204050.21573: variable 'ansible_shell_type' from source: unknown 8975 1727204050.21576: variable 'ansible_shell_executable' from source: unknown 8975 1727204050.21578: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.21581: variable 'ansible_pipelining' from source: unknown 8975 1727204050.21583: variable 'ansible_timeout' from source: unknown 8975 1727204050.21585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.21734: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204050.21751: variable 'omit' from source: magic vars 8975 1727204050.21760: starting attempt loop 8975 1727204050.21771: running the handler 8975 1727204050.21794: _low_level_execute_command(): starting 8975 1727204050.21807: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204050.22675: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.22725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204050.22759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204050.22807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204050.22880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204050.24679: stdout chunk (state=3): >>>/root <<< 8975 1727204050.24853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204050.24881: stderr chunk (state=3): >>><<< 8975 1727204050.24885: stdout chunk (state=3): >>><<< 8975 1727204050.25097: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204050.25101: _low_level_execute_command(): starting 8975 1727204050.25104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697 `" && echo ansible-tmp-1727204050.2491663-11003-1073472065697="` echo /root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697 `" ) && sleep 0' 8975 1727204050.25657: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204050.25661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204050.25677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204050.25693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204050.25706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204050.25714: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204050.25724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.25747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204050.25888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204050.25896: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.25926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204050.25932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204050.26044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204050.28064: stdout chunk (state=3): >>>ansible-tmp-1727204050.2491663-11003-1073472065697=/root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697 <<< 8975 1727204050.28357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 
1727204050.28361: stdout chunk (state=3): >>><<< 8975 1727204050.28364: stderr chunk (state=3): >>><<< 8975 1727204050.28571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204050.2491663-11003-1073472065697=/root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204050.28575: variable 'ansible_module_compression' from source: unknown 8975 1727204050.28578: ANSIBALLZ: Using lock for ping 8975 1727204050.28580: ANSIBALLZ: Acquiring lock 8975 1727204050.28583: ANSIBALLZ: Lock acquired: 140501801077408 8975 1727204050.28585: ANSIBALLZ: Creating module 8975 1727204050.43024: ANSIBALLZ: Writing module into payload 8975 1727204050.43107: ANSIBALLZ: Writing module 8975 1727204050.43133: ANSIBALLZ: Renaming module 8975 1727204050.43149: ANSIBALLZ: Done creating module 8975 1727204050.43170: variable 'ansible_facts' from source: unknown 8975 1727204050.43241: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/AnsiballZ_ping.py 8975 1727204050.43487: Sending initial data 8975 1727204050.43490: Sent initial data (150 bytes) 8975 1727204050.44148: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204050.44158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204050.44172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204050.44187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204050.44221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204050.44225: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204050.44228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.44245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204050.44328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204050.44346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204050.44459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204050.46202: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204050.46269: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204050.46334: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpwkr_11iw /root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/AnsiballZ_ping.py <<< 8975 1727204050.46348: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/AnsiballZ_ping.py" <<< 8975 1727204050.46403: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpwkr_11iw" to remote "/root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/AnsiballZ_ping.py" <<< 8975 1727204050.47072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204050.47170: stderr chunk (state=3): >>><<< 8975 1727204050.47174: stdout chunk (state=3): >>><<< 8975 1727204050.47194: done transferring module to remote 8975 1727204050.47204: _low_level_execute_command(): starting 8975 1727204050.47210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/ /root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/AnsiballZ_ping.py && sleep 0' 8975 1727204050.47728: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204050.47733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204050.47741: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.47796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204050.47800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204050.47806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204050.47891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204050.49802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204050.49873: stderr chunk (state=3): >>><<< 8975 1727204050.49970: stdout chunk (state=3): >>><<< 8975 1727204050.49976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204050.49979: _low_level_execute_command(): starting 8975 1727204050.49982: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/AnsiballZ_ping.py && sleep 0' 8975 1727204050.50656: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204050.50703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204050.50775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.50844: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204050.50874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204050.50904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204050.51027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204050.67584: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 8975 1727204050.68942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204050.68998: stderr chunk (state=3): >>><<< 8975 1727204050.69001: stdout chunk (state=3): >>><<< 8975 1727204050.69016: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
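The low-level command trace above is the role's "Re-test connectivity" step (tasks/main.yml:192): Ansible creates a temporary directory on managed-node2 over the existing SSH control socket, copies the AnsiballZ-packed ping module into it via sftp, runs it with /usr/bin/python3.12, and reads back the {"ping": "pong"} reply before removing the directory. In playbook form the step amounts to a bare ping task; a minimal sketch is shown here, assuming the role's actual task carries no extra attributes:

- name: Re-test connectivity
  ping: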
8975 1727204050.69039: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204050.69053: _low_level_execute_command(): starting 8975 1727204050.69057: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204050.2491663-11003-1073472065697/ > /dev/null 2>&1 && sleep 0' 8975 1727204050.69570: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204050.69575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.69577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204050.69580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204050.69583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.69642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204050.69646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204050.69657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204050.69724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204050.71680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204050.71738: stderr chunk (state=3): >>><<< 8975 1727204050.71742: stdout chunk (state=3): >>><<< 8975 1727204050.71757: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204050.71764: handler run complete 8975 1727204050.71779: attempt loop complete, returning result 8975 1727204050.71782: _execute() done 8975 1727204050.71785: dumping result to json 8975 1727204050.71787: done dumping result, returning 8975 1727204050.71796: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-9356-306d-00000000003b] 8975 1727204050.71801: sending task result for task 127b8e07-fff9-9356-306d-00000000003b 8975 1727204050.71901: done sending task result for task 127b8e07-fff9-9356-306d-00000000003b 8975 1727204050.71905: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 8975 1727204050.71969: no more pending results, returning what we have 8975 1727204050.71973: results queue empty 8975 1727204050.71974: checking for any_errors_fatal 8975 1727204050.71979: done checking for any_errors_fatal 8975 1727204050.71980: checking for max_fail_percentage 8975 1727204050.71982: done checking for max_fail_percentage 8975 1727204050.71983: checking to see if all hosts have failed and the running result is not ok 8975 1727204050.71984: done checking to see if all hosts have failed 8975 1727204050.71985: getting the remaining hosts for this loop 8975 1727204050.71986: done getting the remaining hosts for this loop 8975 1727204050.71990: getting the next task for host managed-node2 8975 1727204050.72001: done getting next task for host managed-node2 8975 1727204050.72003: ^ task is: TASK: meta (role_complete) 8975 1727204050.72006: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204050.72017: getting variables 8975 1727204050.72019: in VariableManager get_vars() 8975 1727204050.72108: Calling all_inventory to load vars for managed-node2 8975 1727204050.72112: Calling groups_inventory to load vars for managed-node2 8975 1727204050.72114: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204050.72126: Calling all_plugins_play to load vars for managed-node2 8975 1727204050.72131: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204050.72134: Calling groups_plugins_play to load vars for managed-node2 8975 1727204050.73242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204050.74421: done with get_vars() 8975 1727204050.74451: done getting variables 8975 1727204050.74524: done queuing things up, now waiting for results queue to drain 8975 1727204050.74526: results queue empty 8975 1727204050.74526: checking for any_errors_fatal 8975 1727204050.74530: done checking for any_errors_fatal 8975 1727204050.74531: checking for max_fail_percentage 8975 1727204050.74532: done checking for max_fail_percentage 8975 1727204050.74532: checking to see if all hosts have failed and the running result is not ok 8975 1727204050.74533: done checking to see if all hosts have failed 8975 1727204050.74533: getting the remaining hosts for this loop 8975 1727204050.74534: done getting the remaining hosts for this loop 8975 1727204050.74536: getting the next task for host managed-node2 8975 1727204050.74540: done getting next task for host managed-node2 8975 1727204050.74542: ^ task is: TASK: Include the task 'get_interface_stat.yml' 8975 1727204050.74544: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204050.74545: getting variables 8975 1727204050.74546: in VariableManager get_vars() 8975 1727204050.74559: Calling all_inventory to load vars for managed-node2 8975 1727204050.74561: Calling groups_inventory to load vars for managed-node2 8975 1727204050.74562: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204050.74569: Calling all_plugins_play to load vars for managed-node2 8975 1727204050.74571: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204050.74574: Calling groups_plugins_play to load vars for managed-node2 8975 1727204050.75510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204050.76712: done with get_vars() 8975 1727204050.76741: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.576) 0:00:22.085 ***** 8975 1727204050.76808: entering _queue_task() for managed-node2/include_tasks 8975 1727204050.77101: worker is 1 (out of 1 available) 8975 1727204050.77116: exiting _queue_task() for managed-node2/include_tasks 8975 1727204050.77132: done queuing things up, now waiting for results queue to drain 8975 1727204050.77133: waiting for pending results... 8975 1727204050.77322: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 8975 1727204050.77424: in run() - task 127b8e07-fff9-9356-306d-00000000006e 8975 1727204050.77437: variable 'ansible_search_path' from source: unknown 8975 1727204050.77441: variable 'ansible_search_path' from source: unknown 8975 1727204050.77482: calling self._execute() 8975 1727204050.77548: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.77551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.77563: variable 'omit' from source: magic vars 8975 1727204050.77886: variable 'ansible_distribution_major_version' from source: facts 8975 1727204050.77896: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204050.77902: _execute() done 8975 1727204050.77907: dumping result to json 8975 1727204050.77911: done dumping result, returning 8975 1727204050.77919: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-9356-306d-00000000006e] 8975 1727204050.77921: sending task result for task 127b8e07-fff9-9356-306d-00000000006e 8975 1727204050.78033: done sending task result for task 127b8e07-fff9-9356-306d-00000000006e 8975 1727204050.78036: WORKER PROCESS EXITING 8975 1727204050.78061: no more pending results, returning what we have 8975 1727204050.78069: in VariableManager get_vars() 8975 1727204050.78119: Calling all_inventory to load vars for managed-node2 8975 1727204050.78122: Calling groups_inventory to load vars for managed-node2 8975 1727204050.78124: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204050.78141: Calling all_plugins_play to load vars for managed-node2 8975 1727204050.78144: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204050.78147: Calling groups_plugins_play to load vars for managed-node2 8975 1727204050.79243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 
1727204050.80464: done with get_vars() 8975 1727204050.80491: variable 'ansible_search_path' from source: unknown 8975 1727204050.80493: variable 'ansible_search_path' from source: unknown 8975 1727204050.80542: we have included files to process 8975 1727204050.80543: generating all_blocks data 8975 1727204050.80545: done generating all_blocks data 8975 1727204050.80550: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204050.80551: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204050.80553: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 8975 1727204050.80758: done processing included file 8975 1727204050.80761: iterating over new_blocks loaded from include file 8975 1727204050.80763: in VariableManager get_vars() 8975 1727204050.80789: done with get_vars() 8975 1727204050.80791: filtering new block on tags 8975 1727204050.80810: done filtering new block on tags 8975 1727204050.80813: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 8975 1727204050.80818: extending task lists for all hosts with included blocks 8975 1727204050.80933: done extending task lists 8975 1727204050.80935: done processing included files 8975 1727204050.80936: results queue empty 8975 1727204050.80937: checking for any_errors_fatal 8975 1727204050.80939: done checking for any_errors_fatal 8975 1727204050.80940: checking for max_fail_percentage 8975 1727204050.80941: done checking for max_fail_percentage 8975 1727204050.80942: checking to see if all hosts have failed and the running result is not ok 8975 1727204050.80943: done checking to see if all hosts have failed 8975 1727204050.80943: getting the remaining hosts for this loop 8975 1727204050.80945: done getting the remaining hosts for this loop 8975 1727204050.80948: getting the next task for host managed-node2 8975 1727204050.80952: done getting next task for host managed-node2 8975 1727204050.80955: ^ task is: TASK: Get stat for interface {{ interface }} 8975 1727204050.80958: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204050.80960: getting variables 8975 1727204050.80961: in VariableManager get_vars() 8975 1727204050.80980: Calling all_inventory to load vars for managed-node2 8975 1727204050.80983: Calling groups_inventory to load vars for managed-node2 8975 1727204050.80985: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204050.80991: Calling all_plugins_play to load vars for managed-node2 8975 1727204050.80994: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204050.80997: Calling groups_plugins_play to load vars for managed-node2 8975 1727204050.87726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204050.90649: done with get_vars() 8975 1727204050.90692: done getting variables 8975 1727204050.90863: variable 'interface' from source: task vars 8975 1727204050.90868: variable 'controller_device' from source: play vars 8975 1727204050.90935: variable 'controller_device' from source: play vars TASK [Get stat for interface deprecated-bond] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.141) 0:00:22.226 ***** 8975 1727204050.90971: entering _queue_task() for managed-node2/stat 8975 1727204050.91356: worker is 1 (out of 1 available) 8975 1727204050.91373: exiting _queue_task() for managed-node2/stat 8975 1727204050.91387: done queuing things up, now waiting for results queue to drain 8975 1727204050.91389: waiting for pending results... 8975 1727204050.91739: running TaskExecutor() for managed-node2/TASK: Get stat for interface deprecated-bond 8975 1727204050.91919: in run() - task 127b8e07-fff9-9356-306d-000000000242 8975 1727204050.91924: variable 'ansible_search_path' from source: unknown 8975 1727204050.91930: variable 'ansible_search_path' from source: unknown 8975 1727204050.91976: calling self._execute() 8975 1727204050.92042: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.92049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.92058: variable 'omit' from source: magic vars 8975 1727204050.92656: variable 'ansible_distribution_major_version' from source: facts 8975 1727204050.92661: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204050.92664: variable 'omit' from source: magic vars 8975 1727204050.92822: variable 'omit' from source: magic vars 8975 1727204050.92829: variable 'interface' from source: task vars 8975 1727204050.92833: variable 'controller_device' from source: play vars 8975 1727204050.93010: variable 'controller_device' from source: play vars 8975 1727204050.93014: variable 'omit' from source: magic vars 8975 1727204050.93017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204050.93020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204050.93119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204050.93123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204050.93126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 
1727204050.93132: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204050.93135: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.93137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.93343: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204050.93347: Set connection var ansible_connection to ssh 8975 1727204050.93350: Set connection var ansible_shell_executable to /bin/sh 8975 1727204050.93352: Set connection var ansible_timeout to 10 8975 1727204050.93355: Set connection var ansible_shell_type to sh 8975 1727204050.93358: Set connection var ansible_pipelining to False 8975 1727204050.93360: variable 'ansible_shell_executable' from source: unknown 8975 1727204050.93362: variable 'ansible_connection' from source: unknown 8975 1727204050.93364: variable 'ansible_module_compression' from source: unknown 8975 1727204050.93370: variable 'ansible_shell_type' from source: unknown 8975 1727204050.93373: variable 'ansible_shell_executable' from source: unknown 8975 1727204050.93375: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204050.93472: variable 'ansible_pipelining' from source: unknown 8975 1727204050.93476: variable 'ansible_timeout' from source: unknown 8975 1727204050.93479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204050.93675: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204050.93679: variable 'omit' from source: magic vars 8975 1727204050.93682: starting attempt loop 8975 1727204050.93684: running the handler 8975 1727204050.93686: _low_level_execute_command(): starting 8975 1727204050.93688: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204050.94912: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204050.95036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204050.95081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204050.95145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204050.96882: stdout chunk (state=3): >>>/root <<< 8975 1727204050.97375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204050.97378: 
stdout chunk (state=3): >>><<< 8975 1727204050.97381: stderr chunk (state=3): >>><<< 8975 1727204050.97383: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204050.97387: _low_level_execute_command(): starting 8975 1727204050.97389: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460 `" && echo ansible-tmp-1727204050.9726055-11121-33904164166460="` echo /root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460 `" ) && sleep 0' 8975 1727204050.98207: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204050.98217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204050.98231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204050.98245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204050.98257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204050.98273: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204050.98276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204050.98355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204050.98378: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204050.98387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204050.98502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.00502: stdout chunk (state=3): 
>>>ansible-tmp-1727204050.9726055-11121-33904164166460=/root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460 <<< 8975 1727204051.00701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204051.00716: stderr chunk (state=3): >>><<< 8975 1727204051.00725: stdout chunk (state=3): >>><<< 8975 1727204051.00755: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204050.9726055-11121-33904164166460=/root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204051.00817: variable 'ansible_module_compression' from source: unknown 8975 1727204051.00885: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8975 1727204051.00938: variable 'ansible_facts' from source: unknown 8975 1727204051.01033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/AnsiballZ_stat.py 8975 1727204051.01194: Sending initial data 8975 1727204051.01197: Sent initial data (151 bytes) 8975 1727204051.02175: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204051.02206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.02320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.03962: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204051.04034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204051.04097: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp5zl0zqc8 /root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/AnsiballZ_stat.py <<< 8975 1727204051.04101: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/AnsiballZ_stat.py" <<< 8975 1727204051.04219: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp5zl0zqc8" to remote "/root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/AnsiballZ_stat.py" <<< 8975 1727204051.05939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204051.05944: stdout chunk (state=3): >>><<< 8975 1727204051.05946: stderr chunk (state=3): >>><<< 8975 1727204051.06388: done transferring module to remote 8975 1727204051.06392: _low_level_execute_command(): starting 8975 1727204051.06395: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/ /root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/AnsiballZ_stat.py && sleep 0' 8975 1727204051.07587: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204051.07690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.07721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204051.07795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204051.07813: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.07997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.09822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204051.09935: stderr chunk (state=3): >>><<< 8975 1727204051.09946: stdout chunk (state=3): >>><<< 8975 1727204051.09971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204051.09984: _low_level_execute_command(): starting 8975 1727204051.09994: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/AnsiballZ_stat.py && sleep 0' 8975 1727204051.10669: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204051.10686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204051.10703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.10721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204051.10746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204051.10758: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204051.10784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.10882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204051.10914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.11017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 
1727204051.27687: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34954, "dev": 23, "nlink": 1, "atime": 1727204049.515743, "mtime": 1727204049.515743, "ctime": 1727204049.515743, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8975 1727204051.29075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204051.29092: stdout chunk (state=3): >>><<< 8975 1727204051.29104: stderr chunk (state=3): >>><<< 8975 1727204051.29132: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34954, "dev": 23, "nlink": 1, "atime": 1727204049.515743, "mtime": 1727204049.515743, "ctime": 1727204049.515743, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
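
For readers following the trace: the stat invocation whose raw result appears above is produced by the task at get_interface_stat.yml:3. The log shows the resolved module arguments (get_attributes/get_checksum/get_mime all false, path /sys/class/net/deprecated-bond, where deprecated-bond is {{ interface }} templated from controller_device) and a later "interface_stat.stat.exists" check, but not the task file itself. A minimal sketch consistent with those records follows; anything beyond what the log shows is an assumption.

# Reconstructed sketch of tasks/get_interface_stat.yml (file contents are not
# printed in this log; shape inferred from the module args and the registered
# variable used later in the run).
- name: "Get stat for interface {{ interface }}"
  stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat
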
8975 1727204051.29202: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204051.29230: _low_level_execute_command(): starting 8975 1727204051.29246: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204050.9726055-11121-33904164166460/ > /dev/null 2>&1 && sleep 0' 8975 1727204051.29961: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204051.29983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204051.30000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.30139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204051.30144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204051.30164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.30278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.32288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204051.32292: stdout chunk (state=3): >>><<< 8975 1727204051.32294: stderr chunk (state=3): >>><<< 8975 1727204051.32317: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204051.32344: handler run complete 8975 1727204051.32414: attempt loop complete, returning result 8975 1727204051.32435: _execute() done 8975 1727204051.32438: dumping result to json 8975 1727204051.32471: done dumping result, returning 8975 1727204051.32474: done running TaskExecutor() for managed-node2/TASK: Get stat for interface deprecated-bond [127b8e07-fff9-9356-306d-000000000242] 8975 1727204051.32476: sending task result for task 127b8e07-fff9-9356-306d-000000000242 8975 1727204051.32703: done sending task result for task 127b8e07-fff9-9356-306d-000000000242 8975 1727204051.32706: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204049.515743, "block_size": 4096, "blocks": 0, "ctime": 1727204049.515743, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34954, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "mode": "0777", "mtime": 1727204049.515743, "nlink": 1, "path": "/sys/class/net/deprecated-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8975 1727204051.32823: no more pending results, returning what we have 8975 1727204051.32829: results queue empty 8975 1727204051.32830: checking for any_errors_fatal 8975 1727204051.32831: done checking for any_errors_fatal 8975 1727204051.32832: checking for max_fail_percentage 8975 1727204051.32834: done checking for max_fail_percentage 8975 1727204051.32835: checking to see if all hosts have failed and the running result is not ok 8975 1727204051.32836: done checking to see if all hosts have failed 8975 1727204051.32837: getting the remaining hosts for this loop 8975 1727204051.32839: done getting the remaining hosts for this loop 8975 1727204051.32844: getting the next task for host managed-node2 8975 1727204051.32856: done getting next task for host managed-node2 8975 1727204051.32860: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 8975 1727204051.32863: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204051.33097: getting variables 8975 1727204051.33099: in VariableManager get_vars() 8975 1727204051.33141: Calling all_inventory to load vars for managed-node2 8975 1727204051.33144: Calling groups_inventory to load vars for managed-node2 8975 1727204051.33146: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204051.33157: Calling all_plugins_play to load vars for managed-node2 8975 1727204051.33160: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204051.33163: Calling groups_plugins_play to load vars for managed-node2 8975 1727204051.34603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204051.35919: done with get_vars() 8975 1727204051.35949: done getting variables 8975 1727204051.36035: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204051.36173: variable 'interface' from source: task vars 8975 1727204051.36178: variable 'controller_device' from source: play vars 8975 1727204051.36247: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'deprecated-bond'] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.453) 0:00:22.679 ***** 8975 1727204051.36284: entering _queue_task() for managed-node2/assert 8975 1727204051.36692: worker is 1 (out of 1 available) 8975 1727204051.36704: exiting _queue_task() for managed-node2/assert 8975 1727204051.36717: done queuing things up, now waiting for results queue to drain 8975 1727204051.36718: waiting for pending results... 
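
The TASK banner above belongs to assert_device_present.yml. The log records its two task names and paths (the include at line 3, the assert at line 5 of that file) plus the condition evaluated a few records later ("Evaluated conditional (interface_stat.stat.exists): True", followed by "All assertions passed"), but not the file itself. A minimal reconstruction consistent with those records, with everything else treated as an assumption:

# Reconstructed sketch of tasks/assert_device_present.yml (inferred from the task
# names, task paths, and the evaluated assert condition shown in this trace;
# include_tasks matches the "_queue_task() for managed-node2/include_tasks"
# records above).
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
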
8975 1727204051.37032: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'deprecated-bond' 8975 1727204051.37149: in run() - task 127b8e07-fff9-9356-306d-00000000006f 8975 1727204051.37233: variable 'ansible_search_path' from source: unknown 8975 1727204051.37236: variable 'ansible_search_path' from source: unknown 8975 1727204051.37239: calling self._execute() 8975 1727204051.37335: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.37355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.37372: variable 'omit' from source: magic vars 8975 1727204051.37851: variable 'ansible_distribution_major_version' from source: facts 8975 1727204051.37883: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204051.37903: variable 'omit' from source: magic vars 8975 1727204051.37971: variable 'omit' from source: magic vars 8975 1727204051.38089: variable 'interface' from source: task vars 8975 1727204051.38121: variable 'controller_device' from source: play vars 8975 1727204051.38190: variable 'controller_device' from source: play vars 8975 1727204051.38230: variable 'omit' from source: magic vars 8975 1727204051.38316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204051.38341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204051.38361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204051.38405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204051.38408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204051.38444: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204051.38448: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.38450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.38532: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204051.38535: Set connection var ansible_connection to ssh 8975 1727204051.38542: Set connection var ansible_shell_executable to /bin/sh 8975 1727204051.38551: Set connection var ansible_timeout to 10 8975 1727204051.38554: Set connection var ansible_shell_type to sh 8975 1727204051.38563: Set connection var ansible_pipelining to False 8975 1727204051.38592: variable 'ansible_shell_executable' from source: unknown 8975 1727204051.38595: variable 'ansible_connection' from source: unknown 8975 1727204051.38598: variable 'ansible_module_compression' from source: unknown 8975 1727204051.38601: variable 'ansible_shell_type' from source: unknown 8975 1727204051.38604: variable 'ansible_shell_executable' from source: unknown 8975 1727204051.38606: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.38608: variable 'ansible_pipelining' from source: unknown 8975 1727204051.38611: variable 'ansible_timeout' from source: unknown 8975 1727204051.38616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.38736: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204051.38746: variable 'omit' from source: magic vars 8975 1727204051.38751: starting attempt loop 8975 1727204051.38755: running the handler 8975 1727204051.38871: variable 'interface_stat' from source: set_fact 8975 1727204051.38893: Evaluated conditional (interface_stat.stat.exists): True 8975 1727204051.38896: handler run complete 8975 1727204051.38908: attempt loop complete, returning result 8975 1727204051.38911: _execute() done 8975 1727204051.38914: dumping result to json 8975 1727204051.38917: done dumping result, returning 8975 1727204051.38925: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'deprecated-bond' [127b8e07-fff9-9356-306d-00000000006f] 8975 1727204051.38933: sending task result for task 127b8e07-fff9-9356-306d-00000000006f 8975 1727204051.39033: done sending task result for task 127b8e07-fff9-9356-306d-00000000006f 8975 1727204051.39036: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204051.39091: no more pending results, returning what we have 8975 1727204051.39095: results queue empty 8975 1727204051.39096: checking for any_errors_fatal 8975 1727204051.39107: done checking for any_errors_fatal 8975 1727204051.39107: checking for max_fail_percentage 8975 1727204051.39109: done checking for max_fail_percentage 8975 1727204051.39111: checking to see if all hosts have failed and the running result is not ok 8975 1727204051.39112: done checking to see if all hosts have failed 8975 1727204051.39113: getting the remaining hosts for this loop 8975 1727204051.39115: done getting the remaining hosts for this loop 8975 1727204051.39119: getting the next task for host managed-node2 8975 1727204051.39135: done getting next task for host managed-node2 8975 1727204051.39139: ^ task is: TASK: Include the task 'assert_profile_present.yml' 8975 1727204051.39140: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204051.39151: getting variables 8975 1727204051.39153: in VariableManager get_vars() 8975 1727204051.39196: Calling all_inventory to load vars for managed-node2 8975 1727204051.39199: Calling groups_inventory to load vars for managed-node2 8975 1727204051.39201: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204051.39213: Calling all_plugins_play to load vars for managed-node2 8975 1727204051.39216: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204051.39219: Calling groups_plugins_play to load vars for managed-node2 8975 1727204051.40249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204051.41990: done with get_vars() 8975 1727204051.42019: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:67 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.058) 0:00:22.737 ***** 8975 1727204051.42096: entering _queue_task() for managed-node2/include_tasks 8975 1727204051.42380: worker is 1 (out of 1 available) 8975 1727204051.42396: exiting _queue_task() for managed-node2/include_tasks 8975 1727204051.42410: done queuing things up, now waiting for results queue to drain 8975 1727204051.42412: waiting for pending results... 8975 1727204051.42607: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' 8975 1727204051.42687: in run() - task 127b8e07-fff9-9356-306d-000000000070 8975 1727204051.42697: variable 'ansible_search_path' from source: unknown 8975 1727204051.42745: variable 'controller_profile' from source: play vars 8975 1727204051.42908: variable 'controller_profile' from source: play vars 8975 1727204051.42920: variable 'port1_profile' from source: play vars 8975 1727204051.42978: variable 'port1_profile' from source: play vars 8975 1727204051.42985: variable 'port2_profile' from source: play vars 8975 1727204051.43031: variable 'port2_profile' from source: play vars 8975 1727204051.43045: variable 'omit' from source: magic vars 8975 1727204051.43153: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.43163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.43176: variable 'omit' from source: magic vars 8975 1727204051.43374: variable 'ansible_distribution_major_version' from source: facts 8975 1727204051.43383: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204051.43410: variable 'item' from source: unknown 8975 1727204051.43458: variable 'item' from source: unknown 8975 1727204051.43595: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.43599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.43602: variable 'omit' from source: magic vars 8975 1727204051.43698: variable 'ansible_distribution_major_version' from source: facts 8975 1727204051.43701: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204051.43727: variable 'item' from source: unknown 8975 1727204051.43776: variable 'item' from source: unknown 8975 1727204051.43854: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.43857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.43860: variable 'omit' from 
source: magic vars 8975 1727204051.43985: variable 'ansible_distribution_major_version' from source: facts 8975 1727204051.43991: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204051.44012: variable 'item' from source: unknown 8975 1727204051.44058: variable 'item' from source: unknown 8975 1727204051.44140: dumping result to json 8975 1727204051.44143: done dumping result, returning 8975 1727204051.44145: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' [127b8e07-fff9-9356-306d-000000000070] 8975 1727204051.44148: sending task result for task 127b8e07-fff9-9356-306d-000000000070 8975 1727204051.44192: done sending task result for task 127b8e07-fff9-9356-306d-000000000070 8975 1727204051.44194: WORKER PROCESS EXITING 8975 1727204051.44275: no more pending results, returning what we have 8975 1727204051.44279: in VariableManager get_vars() 8975 1727204051.44334: Calling all_inventory to load vars for managed-node2 8975 1727204051.44337: Calling groups_inventory to load vars for managed-node2 8975 1727204051.44339: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204051.44351: Calling all_plugins_play to load vars for managed-node2 8975 1727204051.44354: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204051.44357: Calling groups_plugins_play to load vars for managed-node2 8975 1727204051.46232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204051.48711: done with get_vars() 8975 1727204051.48747: variable 'ansible_search_path' from source: unknown 8975 1727204051.48770: variable 'ansible_search_path' from source: unknown 8975 1727204051.48780: variable 'ansible_search_path' from source: unknown 8975 1727204051.48787: we have included files to process 8975 1727204051.48789: generating all_blocks data 8975 1727204051.48790: done generating all_blocks data 8975 1727204051.48803: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.48804: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.48807: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.49052: in VariableManager get_vars() 8975 1727204051.49083: done with get_vars() 8975 1727204051.49301: done processing included file 8975 1727204051.49303: iterating over new_blocks loaded from include file 8975 1727204051.49304: in VariableManager get_vars() 8975 1727204051.49319: done with get_vars() 8975 1727204051.49320: filtering new block on tags 8975 1727204051.49338: done filtering new block on tags 8975 1727204051.49339: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0) 8975 1727204051.49345: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.49346: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.49349: Loading data from 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.49443: in VariableManager get_vars() 8975 1727204051.49462: done with get_vars() 8975 1727204051.49641: done processing included file 8975 1727204051.49643: iterating over new_blocks loaded from include file 8975 1727204051.49644: in VariableManager get_vars() 8975 1727204051.49657: done with get_vars() 8975 1727204051.49658: filtering new block on tags 8975 1727204051.49676: done filtering new block on tags 8975 1727204051.49677: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0.0) 8975 1727204051.49681: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.49681: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.49684: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 8975 1727204051.49824: in VariableManager get_vars() 8975 1727204051.49844: done with get_vars() 8975 1727204051.50013: done processing included file 8975 1727204051.50014: iterating over new_blocks loaded from include file 8975 1727204051.50015: in VariableManager get_vars() 8975 1727204051.50029: done with get_vars() 8975 1727204051.50030: filtering new block on tags 8975 1727204051.50044: done filtering new block on tags 8975 1727204051.50045: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0.1) 8975 1727204051.50048: extending task lists for all hosts with included blocks 8975 1727204051.52243: done extending task lists 8975 1727204051.52251: done processing included files 8975 1727204051.52251: results queue empty 8975 1727204051.52252: checking for any_errors_fatal 8975 1727204051.52254: done checking for any_errors_fatal 8975 1727204051.52255: checking for max_fail_percentage 8975 1727204051.52256: done checking for max_fail_percentage 8975 1727204051.52256: checking to see if all hosts have failed and the running result is not ok 8975 1727204051.52257: done checking to see if all hosts have failed 8975 1727204051.52258: getting the remaining hosts for this loop 8975 1727204051.52259: done getting the remaining hosts for this loop 8975 1727204051.52260: getting the next task for host managed-node2 8975 1727204051.52263: done getting next task for host managed-node2 8975 1727204051.52267: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8975 1727204051.52269: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204051.52271: getting variables 8975 1727204051.52271: in VariableManager get_vars() 8975 1727204051.52286: Calling all_inventory to load vars for managed-node2 8975 1727204051.52288: Calling groups_inventory to load vars for managed-node2 8975 1727204051.52289: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204051.52296: Calling all_plugins_play to load vars for managed-node2 8975 1727204051.52297: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204051.52299: Calling groups_plugins_play to load vars for managed-node2 8975 1727204051.53219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204051.54418: done with get_vars() 8975 1727204051.54447: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.124) 0:00:22.862 ***** 8975 1727204051.54517: entering _queue_task() for managed-node2/include_tasks 8975 1727204051.54817: worker is 1 (out of 1 available) 8975 1727204051.54834: exiting _queue_task() for managed-node2/include_tasks 8975 1727204051.54848: done queuing things up, now waiting for results queue to drain 8975 1727204051.54850: waiting for pending results... 8975 1727204051.55044: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 8975 1727204051.55126: in run() - task 127b8e07-fff9-9356-306d-000000000260 8975 1727204051.55139: variable 'ansible_search_path' from source: unknown 8975 1727204051.55142: variable 'ansible_search_path' from source: unknown 8975 1727204051.55178: calling self._execute() 8975 1727204051.55251: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.55256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.55267: variable 'omit' from source: magic vars 8975 1727204051.55588: variable 'ansible_distribution_major_version' from source: facts 8975 1727204051.55598: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204051.55604: _execute() done 8975 1727204051.55607: dumping result to json 8975 1727204051.55611: done dumping result, returning 8975 1727204051.55618: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-9356-306d-000000000260] 8975 1727204051.55635: sending task result for task 127b8e07-fff9-9356-306d-000000000260 8975 1727204051.55726: done sending task result for task 127b8e07-fff9-9356-306d-000000000260 8975 1727204051.55731: WORKER PROCESS EXITING 8975 1727204051.55761: no more pending results, returning what we have 8975 1727204051.55768: in VariableManager get_vars() 8975 1727204051.55818: Calling all_inventory to load vars for managed-node2 8975 1727204051.55821: Calling groups_inventory to load vars for managed-node2 8975 1727204051.55823: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204051.55840: Calling all_plugins_play to load vars for managed-node2 8975 1727204051.55843: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204051.55846: Calling groups_plugins_play to load vars for managed-node2 8975 1727204051.56892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204051.58340: 
done with get_vars() 8975 1727204051.58363: variable 'ansible_search_path' from source: unknown 8975 1727204051.58365: variable 'ansible_search_path' from source: unknown 8975 1727204051.58412: we have included files to process 8975 1727204051.58413: generating all_blocks data 8975 1727204051.58414: done generating all_blocks data 8975 1727204051.58416: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204051.58417: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204051.58419: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204051.59442: done processing included file 8975 1727204051.59444: iterating over new_blocks loaded from include file 8975 1727204051.59446: in VariableManager get_vars() 8975 1727204051.59471: done with get_vars() 8975 1727204051.59473: filtering new block on tags 8975 1727204051.59498: done filtering new block on tags 8975 1727204051.59501: in VariableManager get_vars() 8975 1727204051.59520: done with get_vars() 8975 1727204051.59521: filtering new block on tags 8975 1727204051.59544: done filtering new block on tags 8975 1727204051.59546: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 8975 1727204051.59551: extending task lists for all hosts with included blocks 8975 1727204051.59726: done extending task lists 8975 1727204051.59729: done processing included files 8975 1727204051.59730: results queue empty 8975 1727204051.59731: checking for any_errors_fatal 8975 1727204051.59734: done checking for any_errors_fatal 8975 1727204051.59735: checking for max_fail_percentage 8975 1727204051.59736: done checking for max_fail_percentage 8975 1727204051.59737: checking to see if all hosts have failed and the running result is not ok 8975 1727204051.59738: done checking to see if all hosts have failed 8975 1727204051.59738: getting the remaining hosts for this loop 8975 1727204051.59740: done getting the remaining hosts for this loop 8975 1727204051.59742: getting the next task for host managed-node2 8975 1727204051.59746: done getting next task for host managed-node2 8975 1727204051.59748: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8975 1727204051.59751: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204051.59753: getting variables 8975 1727204051.59754: in VariableManager get_vars() 8975 1727204051.59846: Calling all_inventory to load vars for managed-node2 8975 1727204051.59849: Calling groups_inventory to load vars for managed-node2 8975 1727204051.59851: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204051.59857: Calling all_plugins_play to load vars for managed-node2 8975 1727204051.59860: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204051.59862: Calling groups_plugins_play to load vars for managed-node2 8975 1727204051.60817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204051.62019: done with get_vars() 8975 1727204051.62049: done getting variables 8975 1727204051.62093: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.075) 0:00:22.938 ***** 8975 1727204051.62117: entering _queue_task() for managed-node2/set_fact 8975 1727204051.62415: worker is 1 (out of 1 available) 8975 1727204051.62431: exiting _queue_task() for managed-node2/set_fact 8975 1727204051.62446: done queuing things up, now waiting for results queue to drain 8975 1727204051.62448: waiting for pending results... 8975 1727204051.62639: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 8975 1727204051.62725: in run() - task 127b8e07-fff9-9356-306d-0000000003b3 8975 1727204051.62738: variable 'ansible_search_path' from source: unknown 8975 1727204051.62742: variable 'ansible_search_path' from source: unknown 8975 1727204051.62778: calling self._execute() 8975 1727204051.62856: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.62860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.62871: variable 'omit' from source: magic vars 8975 1727204051.63181: variable 'ansible_distribution_major_version' from source: facts 8975 1727204051.63191: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204051.63196: variable 'omit' from source: magic vars 8975 1727204051.63241: variable 'omit' from source: magic vars 8975 1727204051.63270: variable 'omit' from source: magic vars 8975 1727204051.63307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204051.63341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204051.63359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204051.63377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204051.63387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204051.63414: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 8975 1727204051.63417: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.63420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.63500: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204051.63504: Set connection var ansible_connection to ssh 8975 1727204051.63507: Set connection var ansible_shell_executable to /bin/sh 8975 1727204051.63513: Set connection var ansible_timeout to 10 8975 1727204051.63515: Set connection var ansible_shell_type to sh 8975 1727204051.63525: Set connection var ansible_pipelining to False 8975 1727204051.63547: variable 'ansible_shell_executable' from source: unknown 8975 1727204051.63552: variable 'ansible_connection' from source: unknown 8975 1727204051.63555: variable 'ansible_module_compression' from source: unknown 8975 1727204051.63558: variable 'ansible_shell_type' from source: unknown 8975 1727204051.63560: variable 'ansible_shell_executable' from source: unknown 8975 1727204051.63563: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.63567: variable 'ansible_pipelining' from source: unknown 8975 1727204051.63570: variable 'ansible_timeout' from source: unknown 8975 1727204051.63572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.63691: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204051.63696: variable 'omit' from source: magic vars 8975 1727204051.63702: starting attempt loop 8975 1727204051.63705: running the handler 8975 1727204051.63717: handler run complete 8975 1727204051.63725: attempt loop complete, returning result 8975 1727204051.63730: _execute() done 8975 1727204051.63733: dumping result to json 8975 1727204051.63735: done dumping result, returning 8975 1727204051.63741: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-9356-306d-0000000003b3] 8975 1727204051.63747: sending task result for task 127b8e07-fff9-9356-306d-0000000003b3 8975 1727204051.63849: done sending task result for task 127b8e07-fff9-9356-306d-0000000003b3 8975 1727204051.63852: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 8975 1727204051.63929: no more pending results, returning what we have 8975 1727204051.63934: results queue empty 8975 1727204051.63935: checking for any_errors_fatal 8975 1727204051.63937: done checking for any_errors_fatal 8975 1727204051.63937: checking for max_fail_percentage 8975 1727204051.63939: done checking for max_fail_percentage 8975 1727204051.63940: checking to see if all hosts have failed and the running result is not ok 8975 1727204051.63941: done checking to see if all hosts have failed 8975 1727204051.63942: getting the remaining hosts for this loop 8975 1727204051.63944: done getting the remaining hosts for this loop 8975 1727204051.63948: getting the next task for host managed-node2 8975 1727204051.63956: done getting next task for host managed-node2 8975 1727204051.63959: ^ task is: TASK: Stat profile file 8975 1727204051.63962: ^ state is: HOST 
STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204051.63968: getting variables 8975 1727204051.63969: in VariableManager get_vars() 8975 1727204051.64018: Calling all_inventory to load vars for managed-node2 8975 1727204051.64021: Calling groups_inventory to load vars for managed-node2 8975 1727204051.64023: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204051.64035: Calling all_plugins_play to load vars for managed-node2 8975 1727204051.64038: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204051.64041: Calling groups_plugins_play to load vars for managed-node2 8975 1727204051.65133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204051.66321: done with get_vars() 8975 1727204051.66349: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.043) 0:00:22.981 ***** 8975 1727204051.66432: entering _queue_task() for managed-node2/stat 8975 1727204051.66724: worker is 1 (out of 1 available) 8975 1727204051.66739: exiting _queue_task() for managed-node2/stat 8975 1727204051.66751: done queuing things up, now waiting for results queue to drain 8975 1727204051.66753: waiting for pending results... 
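[Editor's sketch] The two get_profile_stat.yml tasks exercised here, "Initialize NM profile exist and ansible_managed comment flag" (get_profile_stat.yml:3) and "Stat profile file" (get_profile_stat.yml:9), can be reconstructed roughly from the logged set_fact result and the stat module arguments. This is a minimal sketch, not the actual collection file: the register name is inferred from the profile_stat variable evaluated later, and templating the path from the include parameter 'profile' is an assumption based on the "from source: include params" entries above.

    # sketch reconstructed from the debug log, not the shipped task file
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false            # facts as seen in the ok: result above
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

    - name: Stat profile file
      stat:
        # logged module_args used /etc/sysconfig/network-scripts/ifcfg-bond0;
        # templating on the 'profile' include param is an assumption
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat                     # name inferred from the later skip condition

The ansible_distribution_major_version != '6' conditional that is evaluated before each task appears to be inherited from the surrounding test play rather than written on these tasks; that placement is likewise an assumption.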
8975 1727204051.66947: running TaskExecutor() for managed-node2/TASK: Stat profile file 8975 1727204051.67039: in run() - task 127b8e07-fff9-9356-306d-0000000003b4 8975 1727204051.67052: variable 'ansible_search_path' from source: unknown 8975 1727204051.67056: variable 'ansible_search_path' from source: unknown 8975 1727204051.67092: calling self._execute() 8975 1727204051.67186: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.67190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.67200: variable 'omit' from source: magic vars 8975 1727204051.67516: variable 'ansible_distribution_major_version' from source: facts 8975 1727204051.67526: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204051.67540: variable 'omit' from source: magic vars 8975 1727204051.67573: variable 'omit' from source: magic vars 8975 1727204051.67654: variable 'profile' from source: include params 8975 1727204051.67658: variable 'item' from source: include params 8975 1727204051.67709: variable 'item' from source: include params 8975 1727204051.67724: variable 'omit' from source: magic vars 8975 1727204051.67769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204051.67800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204051.67819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204051.67837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204051.67848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204051.67878: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204051.67881: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.67883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.67961: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204051.67964: Set connection var ansible_connection to ssh 8975 1727204051.67967: Set connection var ansible_shell_executable to /bin/sh 8975 1727204051.67981: Set connection var ansible_timeout to 10 8975 1727204051.67984: Set connection var ansible_shell_type to sh 8975 1727204051.67990: Set connection var ansible_pipelining to False 8975 1727204051.68010: variable 'ansible_shell_executable' from source: unknown 8975 1727204051.68013: variable 'ansible_connection' from source: unknown 8975 1727204051.68016: variable 'ansible_module_compression' from source: unknown 8975 1727204051.68018: variable 'ansible_shell_type' from source: unknown 8975 1727204051.68021: variable 'ansible_shell_executable' from source: unknown 8975 1727204051.68024: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204051.68029: variable 'ansible_pipelining' from source: unknown 8975 1727204051.68035: variable 'ansible_timeout' from source: unknown 8975 1727204051.68039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204051.68215: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204051.68224: variable 'omit' from source: magic vars 8975 1727204051.68229: starting attempt loop 8975 1727204051.68235: running the handler 8975 1727204051.68248: _low_level_execute_command(): starting 8975 1727204051.68255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204051.68824: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.68834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.68837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.68889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204051.68893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.68987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.70742: stdout chunk (state=3): >>>/root <<< 8975 1727204051.70892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204051.70920: stderr chunk (state=3): >>><<< 8975 1727204051.70924: stdout chunk (state=3): >>><<< 8975 1727204051.70949: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204051.70960: _low_level_execute_command(): starting 8975 1727204051.70969: _low_level_execute_command(): executing: /bin/sh -c '( 
umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082 `" && echo ansible-tmp-1727204051.709483-11190-244269062708082="` echo /root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082 `" ) && sleep 0' 8975 1727204051.71453: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.71487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.71499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.71502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.71543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204051.71547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204051.71559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.71636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.73612: stdout chunk (state=3): >>>ansible-tmp-1727204051.709483-11190-244269062708082=/root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082 <<< 8975 1727204051.73716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204051.73795: stderr chunk (state=3): >>><<< 8975 1727204051.73799: stdout chunk (state=3): >>><<< 8975 1727204051.73810: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204051.709483-11190-244269062708082=/root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 8975 1727204051.73863: variable 'ansible_module_compression' from source: unknown 8975 1727204051.73907: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8975 1727204051.73948: variable 'ansible_facts' from source: unknown 8975 1727204051.74006: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/AnsiballZ_stat.py 8975 1727204051.74138: Sending initial data 8975 1727204051.74142: Sent initial data (151 bytes) 8975 1727204051.74674: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.74678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.74681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.74683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.74738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204051.74741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.74819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.76405: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204051.76469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204051.76574: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpmg_5r48e /root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/AnsiballZ_stat.py <<< 8975 1727204051.76578: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/AnsiballZ_stat.py" <<< 8975 1727204051.76636: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpmg_5r48e" to remote "/root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/AnsiballZ_stat.py" <<< 8975 1727204051.77574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204051.77579: stderr chunk (state=3): >>><<< 8975 1727204051.77581: stdout chunk (state=3): >>><<< 8975 1727204051.77590: done transferring module to remote 8975 1727204051.77673: _low_level_execute_command(): starting 8975 1727204051.77676: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/ /root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/AnsiballZ_stat.py && sleep 0' 8975 1727204051.78345: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204051.78364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204051.78443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.78515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204051.78541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204051.78598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.78683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.80615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204051.80620: stdout chunk (state=3): >>><<< 8975 1727204051.80622: stderr chunk (state=3): >>><<< 8975 1727204051.80642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204051.80739: _low_level_execute_command(): starting 8975 1727204051.80743: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/AnsiballZ_stat.py && sleep 0' 8975 1727204051.81327: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204051.81347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204051.81362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204051.81386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204051.81406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204051.81414: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204051.81424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204051.81525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204051.81546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204051.81654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204051.98247: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8975 1727204051.99606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204051.99669: stderr chunk (state=3): >>><<< 8975 1727204051.99675: stdout chunk (state=3): >>><<< 8975 1727204051.99691: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204051.99716: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204051.99727: _low_level_execute_command(): starting 8975 1727204051.99734: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204051.709483-11190-244269062708082/ > /dev/null 2>&1 && sleep 0' 8975 1727204052.00236: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204052.00240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.00243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204052.00245: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204052.00247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.00304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204052.00307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204052.00314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204052.00388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204052.02346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204052.02406: stderr chunk (state=3): >>><<< 8975 1727204052.02412: stdout chunk (state=3): >>><<< 8975 1727204052.02426: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204052.02434: handler run complete 8975 1727204052.02452: attempt loop complete, returning result 8975 1727204052.02455: _execute() done 8975 1727204052.02458: dumping result to json 8975 1727204052.02462: done dumping result, returning 8975 1727204052.02472: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-9356-306d-0000000003b4] 8975 1727204052.02477: sending task result for task 127b8e07-fff9-9356-306d-0000000003b4 8975 1727204052.02586: done sending task result for task 127b8e07-fff9-9356-306d-0000000003b4 8975 1727204052.02589: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 8975 1727204052.02647: no more pending results, returning what we have 8975 1727204052.02650: results queue empty 8975 1727204052.02651: checking for any_errors_fatal 8975 1727204052.02658: done checking for any_errors_fatal 8975 1727204052.02659: checking for max_fail_percentage 8975 1727204052.02661: done checking for max_fail_percentage 8975 1727204052.02662: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.02663: done checking to see if all hosts have failed 8975 1727204052.02664: getting the remaining hosts for this loop 8975 1727204052.02668: done getting the remaining 
hosts for this loop 8975 1727204052.02672: getting the next task for host managed-node2 8975 1727204052.02681: done getting next task for host managed-node2 8975 1727204052.02683: ^ task is: TASK: Set NM profile exist flag based on the profile files 8975 1727204052.02687: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204052.02692: getting variables 8975 1727204052.02693: in VariableManager get_vars() 8975 1727204052.02738: Calling all_inventory to load vars for managed-node2 8975 1727204052.02742: Calling groups_inventory to load vars for managed-node2 8975 1727204052.02744: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.02755: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.02758: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.02761: Calling groups_plugins_play to load vars for managed-node2 8975 1727204052.03910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204052.05121: done with get_vars() 8975 1727204052.05152: done getting variables 8975 1727204052.05206: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.387) 0:00:23.369 ***** 8975 1727204052.05234: entering _queue_task() for managed-node2/set_fact 8975 1727204052.05531: worker is 1 (out of 1 available) 8975 1727204052.05546: exiting _queue_task() for managed-node2/set_fact 8975 1727204052.05559: done queuing things up, now waiting for results queue to drain 8975 1727204052.05561: waiting for pending results... 
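[Editor's sketch] The task queued here, "Set NM profile exist flag based on the profile files" (get_profile_stat.yml:17), is gated on the registered stat result: the skip message below reports false_condition "profile_stat.stat.exists". A rough sketch under those assumptions (the exact fact it sets is inferred from the task name and the earlier initialization, not confirmed by the log):

    # sketch; the fact assignment is an assumption, the when: comes from the logged false_condition
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists

Because the stat above returned "exists": false for ifcfg-bond0, the condition evaluates to False and the task is skipped, which is exactly what the skipping: result that follows records.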
8975 1727204052.05753: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 8975 1727204052.05839: in run() - task 127b8e07-fff9-9356-306d-0000000003b5 8975 1727204052.05852: variable 'ansible_search_path' from source: unknown 8975 1727204052.05856: variable 'ansible_search_path' from source: unknown 8975 1727204052.05888: calling self._execute() 8975 1727204052.05971: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.05976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.05985: variable 'omit' from source: magic vars 8975 1727204052.06288: variable 'ansible_distribution_major_version' from source: facts 8975 1727204052.06298: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204052.06393: variable 'profile_stat' from source: set_fact 8975 1727204052.06407: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204052.06411: when evaluation is False, skipping this task 8975 1727204052.06414: _execute() done 8975 1727204052.06416: dumping result to json 8975 1727204052.06419: done dumping result, returning 8975 1727204052.06426: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-9356-306d-0000000003b5] 8975 1727204052.06431: sending task result for task 127b8e07-fff9-9356-306d-0000000003b5 8975 1727204052.06533: done sending task result for task 127b8e07-fff9-9356-306d-0000000003b5 8975 1727204052.06536: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204052.06601: no more pending results, returning what we have 8975 1727204052.06605: results queue empty 8975 1727204052.06606: checking for any_errors_fatal 8975 1727204052.06617: done checking for any_errors_fatal 8975 1727204052.06618: checking for max_fail_percentage 8975 1727204052.06620: done checking for max_fail_percentage 8975 1727204052.06621: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.06622: done checking to see if all hosts have failed 8975 1727204052.06623: getting the remaining hosts for this loop 8975 1727204052.06625: done getting the remaining hosts for this loop 8975 1727204052.06632: getting the next task for host managed-node2 8975 1727204052.06640: done getting next task for host managed-node2 8975 1727204052.06643: ^ task is: TASK: Get NM profile info 8975 1727204052.06647: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204052.06651: getting variables 8975 1727204052.06652: in VariableManager get_vars() 8975 1727204052.06696: Calling all_inventory to load vars for managed-node2 8975 1727204052.06699: Calling groups_inventory to load vars for managed-node2 8975 1727204052.06701: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.06712: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.06714: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.06717: Calling groups_plugins_play to load vars for managed-node2 8975 1727204052.07707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204052.08894: done with get_vars() 8975 1727204052.08923: done getting variables 8975 1727204052.08979: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.037) 0:00:23.406 ***** 8975 1727204052.09005: entering _queue_task() for managed-node2/shell 8975 1727204052.09312: worker is 1 (out of 1 available) 8975 1727204052.09332: exiting _queue_task() for managed-node2/shell 8975 1727204052.09347: done queuing things up, now waiting for results queue to drain 8975 1727204052.09349: waiting for pending results... 8975 1727204052.09786: running TaskExecutor() for managed-node2/TASK: Get NM profile info 8975 1727204052.09831: in run() - task 127b8e07-fff9-9356-306d-0000000003b6 8975 1727204052.09857: variable 'ansible_search_path' from source: unknown 8975 1727204052.09870: variable 'ansible_search_path' from source: unknown 8975 1727204052.09924: calling self._execute() 8975 1727204052.10050: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.10067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.10087: variable 'omit' from source: magic vars 8975 1727204052.10544: variable 'ansible_distribution_major_version' from source: facts 8975 1727204052.10672: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204052.10679: variable 'omit' from source: magic vars 8975 1727204052.10683: variable 'omit' from source: magic vars 8975 1727204052.10768: variable 'profile' from source: include params 8975 1727204052.10787: variable 'item' from source: include params 8975 1727204052.10870: variable 'item' from source: include params 8975 1727204052.10901: variable 'omit' from source: magic vars 8975 1727204052.10954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204052.11007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204052.11038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204052.11064: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204052.11088: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204052.11135: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204052.11145: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.11153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.11279: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204052.11289: Set connection var ansible_connection to ssh 8975 1727204052.11329: Set connection var ansible_shell_executable to /bin/sh 8975 1727204052.11334: Set connection var ansible_timeout to 10 8975 1727204052.11336: Set connection var ansible_shell_type to sh 8975 1727204052.11339: Set connection var ansible_pipelining to False 8975 1727204052.11372: variable 'ansible_shell_executable' from source: unknown 8975 1727204052.11437: variable 'ansible_connection' from source: unknown 8975 1727204052.11441: variable 'ansible_module_compression' from source: unknown 8975 1727204052.11443: variable 'ansible_shell_type' from source: unknown 8975 1727204052.11446: variable 'ansible_shell_executable' from source: unknown 8975 1727204052.11449: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.11451: variable 'ansible_pipelining' from source: unknown 8975 1727204052.11454: variable 'ansible_timeout' from source: unknown 8975 1727204052.11456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.11606: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204052.11626: variable 'omit' from source: magic vars 8975 1727204052.11643: starting attempt loop 8975 1727204052.11657: running the handler 8975 1727204052.11679: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204052.11763: _low_level_execute_command(): starting 8975 1727204052.11768: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204052.12571: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204052.12594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204052.12649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.12716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204052.12751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204052.12776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204052.12986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204052.14580: stdout chunk (state=3): >>>/root <<< 8975 1727204052.14686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204052.14749: stderr chunk (state=3): >>><<< 8975 1727204052.14753: stdout chunk (state=3): >>><<< 8975 1727204052.14779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204052.14797: _low_level_execute_command(): starting 8975 1727204052.14804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410 `" && echo ansible-tmp-1727204052.147823-11249-33821512622410="` echo /root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410 `" ) && sleep 0' 8975 1727204052.15313: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204052.15327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.15330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204052.15334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 8975 1727204052.15384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204052.15387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204052.15394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204052.15470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204052.17428: stdout chunk (state=3): >>>ansible-tmp-1727204052.147823-11249-33821512622410=/root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410 <<< 8975 1727204052.17575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204052.17602: stderr chunk (state=3): >>><<< 8975 1727204052.17605: stdout chunk (state=3): >>><<< 8975 1727204052.17622: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204052.147823-11249-33821512622410=/root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204052.17654: variable 'ansible_module_compression' from source: unknown 8975 1727204052.17699: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204052.17740: variable 'ansible_facts' from source: unknown 8975 1727204052.17795: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/AnsiballZ_command.py 8975 1727204052.17915: Sending initial data 8975 1727204052.17919: Sent initial data (153 bytes) 8975 1727204052.18428: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204052.18432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.18435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 
1727204052.18437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.18493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204052.18497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204052.18574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204052.20182: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204052.20247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204052.20320: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpr13to9fi /root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/AnsiballZ_command.py <<< 8975 1727204052.20323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/AnsiballZ_command.py" <<< 8975 1727204052.20387: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpr13to9fi" to remote "/root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/AnsiballZ_command.py" <<< 8975 1727204052.21045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204052.21121: stderr chunk (state=3): >>><<< 8975 1727204052.21125: stdout chunk (state=3): >>><<< 8975 1727204052.21147: done transferring module to remote 8975 1727204052.21158: _low_level_execute_command(): starting 8975 1727204052.21166: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/ /root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/AnsiballZ_command.py && sleep 0' 8975 1727204052.21642: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204052.21647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204052.21683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 
1727204052.21686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204052.21693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204052.21696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.21743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204052.21755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204052.21833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204052.23656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204052.23715: stderr chunk (state=3): >>><<< 8975 1727204052.23719: stdout chunk (state=3): >>><<< 8975 1727204052.23733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204052.23737: _low_level_execute_command(): starting 8975 1727204052.23743: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/AnsiballZ_command.py && sleep 0' 8975 1727204052.24235: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204052.24240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.24253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.24317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204052.24320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204052.24328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204052.24401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204052.52841: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:12.407273", "end": "2024-09-24 14:54:12.526626", "delta": "0:00:00.119353", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204052.54501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204052.54564: stderr chunk (state=3): >>><<< 8975 1727204052.54568: stdout chunk (state=3): >>><<< 8975 1727204052.54586: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:12.407273", "end": "2024-09-24 14:54:12.526626", "delta": "0:00:00.119353", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204052.54617: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204052.54632: _low_level_execute_command(): starting 8975 1727204052.54635: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204052.147823-11249-33821512622410/ > /dev/null 2>&1 && sleep 0' 8975 1727204052.55198: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204052.55202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.55208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204052.55214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204052.55217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204052.55219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204052.55238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204052.55399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204052.57405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204052.57410: stdout chunk (state=3): >>><<< 8975 1727204052.57413: stderr chunk (state=3): >>><<< 8975 1727204052.57442: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204052.57449: handler run complete 8975 1727204052.57478: Evaluated conditional (False): False 8975 1727204052.57488: attempt loop complete, returning result 8975 1727204052.57491: _execute() done 8975 1727204052.57494: dumping result to json 8975 1727204052.57571: done dumping result, returning 8975 1727204052.57574: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-9356-306d-0000000003b6] 8975 1727204052.57576: sending task result for task 127b8e07-fff9-9356-306d-0000000003b6 ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.119353", "end": "2024-09-24 14:54:12.526626", "rc": 0, "start": "2024-09-24 14:54:12.407273" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 8975 1727204052.57735: no more pending results, returning what we have 8975 1727204052.57739: results queue empty 8975 1727204052.57740: checking for any_errors_fatal 8975 1727204052.57748: done checking for any_errors_fatal 8975 1727204052.57749: checking for max_fail_percentage 8975 1727204052.57751: done checking for max_fail_percentage 8975 1727204052.57752: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.57753: done checking to see if all hosts have failed 8975 1727204052.57754: getting the remaining hosts for this loop 8975 1727204052.57756: done getting the remaining hosts for this loop 8975 1727204052.57760: getting the next task for host managed-node2 8975 1727204052.57771: done getting next task for host managed-node2 8975 1727204052.57774: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8975 1727204052.57778: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8975 1727204052.57782: getting variables 8975 1727204052.57784: in VariableManager get_vars() 8975 1727204052.57832: Calling all_inventory to load vars for managed-node2 8975 1727204052.57836: Calling groups_inventory to load vars for managed-node2 8975 1727204052.57838: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.57852: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.57855: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.57858: Calling groups_plugins_play to load vars for managed-node2 8975 1727204052.58632: done sending task result for task 127b8e07-fff9-9356-306d-0000000003b6 8975 1727204052.58638: WORKER PROCESS EXITING 8975 1727204052.60064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204052.62245: done with get_vars() 8975 1727204052.62285: done getting variables 8975 1727204052.62357: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.533) 0:00:23.940 ***** 8975 1727204052.62394: entering _queue_task() for managed-node2/set_fact 8975 1727204052.62800: worker is 1 (out of 1 available) 8975 1727204052.62816: exiting _queue_task() for managed-node2/set_fact 8975 1727204052.62833: done queuing things up, now waiting for results queue to drain 8975 1727204052.62835: waiting for pending results... 
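The "Get NM profile info" result shown above comes from a shell task whose registered output (referenced below as nm_profile_exists) drives the next conditional. A minimal sketch of such a task, assuming a typical register/changed_when layout rather than the verified contents of get_profile_stat.yml, would be:

# Sketch only: the command string comes from the log; the register name is
# inferred from the nm_profile_exists.rc == 0 condition evaluated below, and
# changed_when: false is assumed because the displayed result reports changed: false.
- name: Get NM profile info
  ansible.builtin.shell: "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc"
  register: nm_profile_exists
  changed_when: false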
8975 1727204052.63288: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8975 1727204052.63293: in run() - task 127b8e07-fff9-9356-306d-0000000003b7 8975 1727204052.63305: variable 'ansible_search_path' from source: unknown 8975 1727204052.63313: variable 'ansible_search_path' from source: unknown 8975 1727204052.63360: calling self._execute() 8975 1727204052.63470: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.63482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.63495: variable 'omit' from source: magic vars 8975 1727204052.63917: variable 'ansible_distribution_major_version' from source: facts 8975 1727204052.63938: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204052.64094: variable 'nm_profile_exists' from source: set_fact 8975 1727204052.64119: Evaluated conditional (nm_profile_exists.rc == 0): True 8975 1727204052.64133: variable 'omit' from source: magic vars 8975 1727204052.64271: variable 'omit' from source: magic vars 8975 1727204052.64274: variable 'omit' from source: magic vars 8975 1727204052.64284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204052.64336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204052.64364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204052.64394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204052.64412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204052.64452: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204052.64460: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.64470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.64588: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204052.64596: Set connection var ansible_connection to ssh 8975 1727204052.64611: Set connection var ansible_shell_executable to /bin/sh 8975 1727204052.64621: Set connection var ansible_timeout to 10 8975 1727204052.64629: Set connection var ansible_shell_type to sh 8975 1727204052.64647: Set connection var ansible_pipelining to False 8975 1727204052.64714: variable 'ansible_shell_executable' from source: unknown 8975 1727204052.64718: variable 'ansible_connection' from source: unknown 8975 1727204052.64720: variable 'ansible_module_compression' from source: unknown 8975 1727204052.64722: variable 'ansible_shell_type' from source: unknown 8975 1727204052.64724: variable 'ansible_shell_executable' from source: unknown 8975 1727204052.64726: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.64730: variable 'ansible_pipelining' from source: unknown 8975 1727204052.64733: variable 'ansible_timeout' from source: unknown 8975 1727204052.64735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.64897: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204052.64913: variable 'omit' from source: magic vars 8975 1727204052.64932: starting attempt loop 8975 1727204052.64935: running the handler 8975 1727204052.65041: handler run complete 8975 1727204052.65045: attempt loop complete, returning result 8975 1727204052.65047: _execute() done 8975 1727204052.65051: dumping result to json 8975 1727204052.65053: done dumping result, returning 8975 1727204052.65056: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-9356-306d-0000000003b7] 8975 1727204052.65058: sending task result for task 127b8e07-fff9-9356-306d-0000000003b7 8975 1727204052.65373: done sending task result for task 127b8e07-fff9-9356-306d-0000000003b7 8975 1727204052.65377: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8975 1727204052.65427: no more pending results, returning what we have 8975 1727204052.65432: results queue empty 8975 1727204052.65433: checking for any_errors_fatal 8975 1727204052.65440: done checking for any_errors_fatal 8975 1727204052.65441: checking for max_fail_percentage 8975 1727204052.65442: done checking for max_fail_percentage 8975 1727204052.65443: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.65444: done checking to see if all hosts have failed 8975 1727204052.65445: getting the remaining hosts for this loop 8975 1727204052.65448: done getting the remaining hosts for this loop 8975 1727204052.65452: getting the next task for host managed-node2 8975 1727204052.65462: done getting next task for host managed-node2 8975 1727204052.65467: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 8975 1727204052.65471: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204052.65476: getting variables 8975 1727204052.65477: in VariableManager get_vars() 8975 1727204052.65519: Calling all_inventory to load vars for managed-node2 8975 1727204052.65522: Calling groups_inventory to load vars for managed-node2 8975 1727204052.65524: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.65537: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.65540: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.65544: Calling groups_plugins_play to load vars for managed-node2 8975 1727204052.67459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204052.69651: done with get_vars() 8975 1727204052.69692: done getting variables 8975 1727204052.69763: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204052.69896: variable 'profile' from source: include params 8975 1727204052.69901: variable 'item' from source: include params 8975 1727204052.69971: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.076) 0:00:24.017 ***** 8975 1727204052.70013: entering _queue_task() for managed-node2/command 8975 1727204052.70394: worker is 1 (out of 1 available) 8975 1727204052.70408: exiting _queue_task() for managed-node2/command 8975 1727204052.70423: done queuing things up, now waiting for results queue to drain 8975 1727204052.70424: waiting for pending results... 
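The set_fact task at get_profile_stat.yml:35 then records that the profile exists. Its fact names and rc condition are visible in the log output; a hedged reconstruction (illustrative layout, not the verified file contents) looks like:

# Sketch of the task logged at get_profile_stat.yml:35; the three fact names and
# the when condition are taken from the log, everything else is assumed.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0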
8975 1727204052.70736: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0 8975 1727204052.70877: in run() - task 127b8e07-fff9-9356-306d-0000000003b9 8975 1727204052.70903: variable 'ansible_search_path' from source: unknown 8975 1727204052.70913: variable 'ansible_search_path' from source: unknown 8975 1727204052.70959: calling self._execute() 8975 1727204052.71069: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.71083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.71099: variable 'omit' from source: magic vars 8975 1727204052.71519: variable 'ansible_distribution_major_version' from source: facts 8975 1727204052.71548: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204052.71693: variable 'profile_stat' from source: set_fact 8975 1727204052.71715: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204052.71723: when evaluation is False, skipping this task 8975 1727204052.71733: _execute() done 8975 1727204052.71742: dumping result to json 8975 1727204052.71748: done dumping result, returning 8975 1727204052.71764: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0 [127b8e07-fff9-9356-306d-0000000003b9] 8975 1727204052.71778: sending task result for task 127b8e07-fff9-9356-306d-0000000003b9 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204052.72023: no more pending results, returning what we have 8975 1727204052.72031: results queue empty 8975 1727204052.72032: checking for any_errors_fatal 8975 1727204052.72041: done checking for any_errors_fatal 8975 1727204052.72042: checking for max_fail_percentage 8975 1727204052.72044: done checking for max_fail_percentage 8975 1727204052.72045: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.72047: done checking to see if all hosts have failed 8975 1727204052.72048: getting the remaining hosts for this loop 8975 1727204052.72051: done getting the remaining hosts for this loop 8975 1727204052.72056: getting the next task for host managed-node2 8975 1727204052.72068: done getting next task for host managed-node2 8975 1727204052.72071: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8975 1727204052.72077: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204052.72081: getting variables 8975 1727204052.72083: in VariableManager get_vars() 8975 1727204052.72140: Calling all_inventory to load vars for managed-node2 8975 1727204052.72143: Calling groups_inventory to load vars for managed-node2 8975 1727204052.72146: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.72163: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.72371: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.72377: Calling groups_plugins_play to load vars for managed-node2 8975 1727204052.73085: done sending task result for task 127b8e07-fff9-9356-306d-0000000003b9 8975 1727204052.73089: WORKER PROCESS EXITING 8975 1727204052.74137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204052.76324: done with get_vars() 8975 1727204052.76368: done getting variables 8975 1727204052.76436: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204052.76560: variable 'profile' from source: include params 8975 1727204052.76566: variable 'item' from source: include params 8975 1727204052.76635: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.066) 0:00:24.083 ***** 8975 1727204052.76672: entering _queue_task() for managed-node2/set_fact 8975 1727204052.77062: worker is 1 (out of 1 available) 8975 1727204052.77277: exiting _queue_task() for managed-node2/set_fact 8975 1727204052.77288: done queuing things up, now waiting for results queue to drain 8975 1727204052.77290: waiting for pending results... 
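The ifcfg-related tasks in this block are all skipped because profile_stat.stat.exists is false on managed-node2: the bond0 connection files live under /etc/NetworkManager/system-connections, so there is no ifcfg-bond0 file to inspect. The skip pattern corresponds to a conditional command task along these lines (a sketch under assumptions; the grep target, file path, and register name are illustrative, only the task name and the when condition come from the log):

# Hypothetical shape of the guarded ifcfg check that gets skipped above.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_ansible_managed   # illustrative name
  when: profile_stat.stat.exists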
8975 1727204052.77401: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 8975 1727204052.77553: in run() - task 127b8e07-fff9-9356-306d-0000000003ba 8975 1727204052.77579: variable 'ansible_search_path' from source: unknown 8975 1727204052.77587: variable 'ansible_search_path' from source: unknown 8975 1727204052.77640: calling self._execute() 8975 1727204052.77750: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.77764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.77785: variable 'omit' from source: magic vars 8975 1727204052.78211: variable 'ansible_distribution_major_version' from source: facts 8975 1727204052.78232: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204052.78381: variable 'profile_stat' from source: set_fact 8975 1727204052.78407: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204052.78415: when evaluation is False, skipping this task 8975 1727204052.78423: _execute() done 8975 1727204052.78434: dumping result to json 8975 1727204052.78442: done dumping result, returning 8975 1727204052.78454: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 [127b8e07-fff9-9356-306d-0000000003ba] 8975 1727204052.78465: sending task result for task 127b8e07-fff9-9356-306d-0000000003ba skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204052.78657: no more pending results, returning what we have 8975 1727204052.78661: results queue empty 8975 1727204052.78662: checking for any_errors_fatal 8975 1727204052.78672: done checking for any_errors_fatal 8975 1727204052.78673: checking for max_fail_percentage 8975 1727204052.78675: done checking for max_fail_percentage 8975 1727204052.78676: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.78677: done checking to see if all hosts have failed 8975 1727204052.78678: getting the remaining hosts for this loop 8975 1727204052.78681: done getting the remaining hosts for this loop 8975 1727204052.78685: getting the next task for host managed-node2 8975 1727204052.78695: done getting next task for host managed-node2 8975 1727204052.78698: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8975 1727204052.78702: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204052.78707: getting variables 8975 1727204052.78710: in VariableManager get_vars() 8975 1727204052.78763: Calling all_inventory to load vars for managed-node2 8975 1727204052.78871: Calling groups_inventory to load vars for managed-node2 8975 1727204052.78874: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.78882: done sending task result for task 127b8e07-fff9-9356-306d-0000000003ba 8975 1727204052.78885: WORKER PROCESS EXITING 8975 1727204052.78900: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.78903: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.78907: Calling groups_plugins_play to load vars for managed-node2 8975 1727204052.80996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204052.83191: done with get_vars() 8975 1727204052.83236: done getting variables 8975 1727204052.83305: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204052.83434: variable 'profile' from source: include params 8975 1727204052.83439: variable 'item' from source: include params 8975 1727204052.83503: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.068) 0:00:24.152 ***** 8975 1727204052.83539: entering _queue_task() for managed-node2/command 8975 1727204052.83931: worker is 1 (out of 1 available) 8975 1727204052.83945: exiting _queue_task() for managed-node2/command 8975 1727204052.83962: done queuing things up, now waiting for results queue to drain 8975 1727204052.83963: waiting for pending results... 
8975 1727204052.84274: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0 8975 1727204052.84416: in run() - task 127b8e07-fff9-9356-306d-0000000003bb 8975 1727204052.84442: variable 'ansible_search_path' from source: unknown 8975 1727204052.84450: variable 'ansible_search_path' from source: unknown 8975 1727204052.84501: calling self._execute() 8975 1727204052.84614: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.84718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.84722: variable 'omit' from source: magic vars 8975 1727204052.85074: variable 'ansible_distribution_major_version' from source: facts 8975 1727204052.85094: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204052.85242: variable 'profile_stat' from source: set_fact 8975 1727204052.85270: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204052.85278: when evaluation is False, skipping this task 8975 1727204052.85285: _execute() done 8975 1727204052.85293: dumping result to json 8975 1727204052.85299: done dumping result, returning 8975 1727204052.85311: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0 [127b8e07-fff9-9356-306d-0000000003bb] 8975 1727204052.85323: sending task result for task 127b8e07-fff9-9356-306d-0000000003bb 8975 1727204052.85671: done sending task result for task 127b8e07-fff9-9356-306d-0000000003bb 8975 1727204052.85675: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204052.85723: no more pending results, returning what we have 8975 1727204052.85730: results queue empty 8975 1727204052.85731: checking for any_errors_fatal 8975 1727204052.85737: done checking for any_errors_fatal 8975 1727204052.85738: checking for max_fail_percentage 8975 1727204052.85740: done checking for max_fail_percentage 8975 1727204052.85741: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.85742: done checking to see if all hosts have failed 8975 1727204052.85743: getting the remaining hosts for this loop 8975 1727204052.85744: done getting the remaining hosts for this loop 8975 1727204052.85749: getting the next task for host managed-node2 8975 1727204052.85755: done getting next task for host managed-node2 8975 1727204052.85758: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8975 1727204052.85763: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204052.85769: getting variables 8975 1727204052.85770: in VariableManager get_vars() 8975 1727204052.85812: Calling all_inventory to load vars for managed-node2 8975 1727204052.85815: Calling groups_inventory to load vars for managed-node2 8975 1727204052.85818: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.85834: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.85838: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.85842: Calling groups_plugins_play to load vars for managed-node2 8975 1727204052.87780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204052.89919: done with get_vars() 8975 1727204052.89962: done getting variables 8975 1727204052.90036: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204052.90163: variable 'profile' from source: include params 8975 1727204052.90179: variable 'item' from source: include params 8975 1727204052.90254: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.067) 0:00:24.220 ***** 8975 1727204052.90309: entering _queue_task() for managed-node2/set_fact 8975 1727204052.90907: worker is 1 (out of 1 available) 8975 1727204052.90919: exiting _queue_task() for managed-node2/set_fact 8975 1727204052.90934: done queuing things up, now waiting for results queue to drain 8975 1727204052.90936: waiting for pending results... 
8975 1727204052.91085: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0 8975 1727204052.91224: in run() - task 127b8e07-fff9-9356-306d-0000000003bc 8975 1727204052.91249: variable 'ansible_search_path' from source: unknown 8975 1727204052.91256: variable 'ansible_search_path' from source: unknown 8975 1727204052.91307: calling self._execute() 8975 1727204052.91421: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.91447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.91462: variable 'omit' from source: magic vars 8975 1727204052.91871: variable 'ansible_distribution_major_version' from source: facts 8975 1727204052.91883: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204052.91988: variable 'profile_stat' from source: set_fact 8975 1727204052.92001: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204052.92005: when evaluation is False, skipping this task 8975 1727204052.92009: _execute() done 8975 1727204052.92012: dumping result to json 8975 1727204052.92015: done dumping result, returning 8975 1727204052.92023: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0 [127b8e07-fff9-9356-306d-0000000003bc] 8975 1727204052.92029: sending task result for task 127b8e07-fff9-9356-306d-0000000003bc 8975 1727204052.92129: done sending task result for task 127b8e07-fff9-9356-306d-0000000003bc 8975 1727204052.92132: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204052.92190: no more pending results, returning what we have 8975 1727204052.92194: results queue empty 8975 1727204052.92195: checking for any_errors_fatal 8975 1727204052.92202: done checking for any_errors_fatal 8975 1727204052.92203: checking for max_fail_percentage 8975 1727204052.92205: done checking for max_fail_percentage 8975 1727204052.92206: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.92208: done checking to see if all hosts have failed 8975 1727204052.92208: getting the remaining hosts for this loop 8975 1727204052.92210: done getting the remaining hosts for this loop 8975 1727204052.92214: getting the next task for host managed-node2 8975 1727204052.92224: done getting next task for host managed-node2 8975 1727204052.92227: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8975 1727204052.92230: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204052.92235: getting variables 8975 1727204052.92237: in VariableManager get_vars() 8975 1727204052.92284: Calling all_inventory to load vars for managed-node2 8975 1727204052.92286: Calling groups_inventory to load vars for managed-node2 8975 1727204052.92288: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.92300: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.92302: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.92305: Calling groups_plugins_play to load vars for managed-node2 8975 1727204052.93301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204052.95171: done with get_vars() 8975 1727204052.95206: done getting variables 8975 1727204052.95270: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204052.95483: variable 'profile' from source: include params 8975 1727204052.95487: variable 'item' from source: include params 8975 1727204052.95555: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.052) 0:00:24.272 ***** 8975 1727204052.95599: entering _queue_task() for managed-node2/assert 8975 1727204052.96023: worker is 1 (out of 1 available) 8975 1727204052.96042: exiting _queue_task() for managed-node2/assert 8975 1727204052.96056: done queuing things up, now waiting for results queue to drain 8975 1727204052.96058: waiting for pending results... 
8975 1727204052.96422: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0' 8975 1727204052.96485: in run() - task 127b8e07-fff9-9356-306d-000000000261 8975 1727204052.96490: variable 'ansible_search_path' from source: unknown 8975 1727204052.96494: variable 'ansible_search_path' from source: unknown 8975 1727204052.96524: calling self._execute() 8975 1727204052.96611: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.96617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.96626: variable 'omit' from source: magic vars 8975 1727204052.96934: variable 'ansible_distribution_major_version' from source: facts 8975 1727204052.96942: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204052.96949: variable 'omit' from source: magic vars 8975 1727204052.96979: variable 'omit' from source: magic vars 8975 1727204052.97058: variable 'profile' from source: include params 8975 1727204052.97062: variable 'item' from source: include params 8975 1727204052.97112: variable 'item' from source: include params 8975 1727204052.97133: variable 'omit' from source: magic vars 8975 1727204052.97170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204052.97202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204052.97222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204052.97239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204052.97249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204052.97278: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204052.97281: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.97283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.97365: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204052.97370: Set connection var ansible_connection to ssh 8975 1727204052.97374: Set connection var ansible_shell_executable to /bin/sh 8975 1727204052.97380: Set connection var ansible_timeout to 10 8975 1727204052.97383: Set connection var ansible_shell_type to sh 8975 1727204052.97392: Set connection var ansible_pipelining to False 8975 1727204052.97413: variable 'ansible_shell_executable' from source: unknown 8975 1727204052.97416: variable 'ansible_connection' from source: unknown 8975 1727204052.97418: variable 'ansible_module_compression' from source: unknown 8975 1727204052.97421: variable 'ansible_shell_type' from source: unknown 8975 1727204052.97423: variable 'ansible_shell_executable' from source: unknown 8975 1727204052.97425: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204052.97438: variable 'ansible_pipelining' from source: unknown 8975 1727204052.97443: variable 'ansible_timeout' from source: unknown 8975 1727204052.97445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204052.97551: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204052.97562: variable 'omit' from source: magic vars 8975 1727204052.97568: starting attempt loop 8975 1727204052.97571: running the handler 8975 1727204052.97658: variable 'lsr_net_profile_exists' from source: set_fact 8975 1727204052.97664: Evaluated conditional (lsr_net_profile_exists): True 8975 1727204052.97670: handler run complete 8975 1727204052.97684: attempt loop complete, returning result 8975 1727204052.97687: _execute() done 8975 1727204052.97690: dumping result to json 8975 1727204052.97692: done dumping result, returning 8975 1727204052.97699: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0' [127b8e07-fff9-9356-306d-000000000261] 8975 1727204052.97705: sending task result for task 127b8e07-fff9-9356-306d-000000000261 8975 1727204052.97799: done sending task result for task 127b8e07-fff9-9356-306d-000000000261 8975 1727204052.97802: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204052.97859: no more pending results, returning what we have 8975 1727204052.97863: results queue empty 8975 1727204052.97864: checking for any_errors_fatal 8975 1727204052.97872: done checking for any_errors_fatal 8975 1727204052.97873: checking for max_fail_percentage 8975 1727204052.97875: done checking for max_fail_percentage 8975 1727204052.97876: checking to see if all hosts have failed and the running result is not ok 8975 1727204052.97877: done checking to see if all hosts have failed 8975 1727204052.97878: getting the remaining hosts for this loop 8975 1727204052.97880: done getting the remaining hosts for this loop 8975 1727204052.97884: getting the next task for host managed-node2 8975 1727204052.97892: done getting next task for host managed-node2 8975 1727204052.97894: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8975 1727204052.97897: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204052.97901: getting variables 8975 1727204052.97903: in VariableManager get_vars() 8975 1727204052.97953: Calling all_inventory to load vars for managed-node2 8975 1727204052.97956: Calling groups_inventory to load vars for managed-node2 8975 1727204052.97958: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204052.97974: Calling all_plugins_play to load vars for managed-node2 8975 1727204052.97977: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204052.97980: Calling groups_plugins_play to load vars for managed-node2 8975 1727204053.03322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204053.04479: done with get_vars() 8975 1727204053.04507: done getting variables 8975 1727204053.04549: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204053.04630: variable 'profile' from source: include params 8975 1727204053.04632: variable 'item' from source: include params 8975 1727204053.04677: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.091) 0:00:24.363 ***** 8975 1727204053.04705: entering _queue_task() for managed-node2/assert 8975 1727204053.04990: worker is 1 (out of 1 available) 8975 1727204053.05004: exiting _queue_task() for managed-node2/assert 8975 1727204053.05018: done queuing things up, now waiting for results queue to drain 8975 1727204053.05019: waiting for pending results... 
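With the lsr_net_profile_* facts in place, the assertions from assert_profile_present.yml (lines 5 and 10 in the task paths above, plus the fingerprint assertion queued next) simply check each flag. A hedged reconstruction of that file, based only on the task names and the conditionals evaluated in this log, would be:

# Sketch of assert_profile_present.yml; the task names and the first two asserted
# expressions appear in the log, the fingerprint expression is an assumption.
- name: Assert that the profile is present - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists
- name: Assert that the ansible managed comment is present in '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed
- name: Assert that the fingerprint comment is present in {{ profile }}
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint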
8975 1727204053.05215: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0' 8975 1727204053.05310: in run() - task 127b8e07-fff9-9356-306d-000000000262 8975 1727204053.05321: variable 'ansible_search_path' from source: unknown 8975 1727204053.05325: variable 'ansible_search_path' from source: unknown 8975 1727204053.05364: calling self._execute() 8975 1727204053.05446: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.05449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.05462: variable 'omit' from source: magic vars 8975 1727204053.05776: variable 'ansible_distribution_major_version' from source: facts 8975 1727204053.05788: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204053.05793: variable 'omit' from source: magic vars 8975 1727204053.05830: variable 'omit' from source: magic vars 8975 1727204053.05908: variable 'profile' from source: include params 8975 1727204053.05911: variable 'item' from source: include params 8975 1727204053.05967: variable 'item' from source: include params 8975 1727204053.05981: variable 'omit' from source: magic vars 8975 1727204053.06018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204053.06053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204053.06072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204053.06087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.06098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.06124: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204053.06128: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.06135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.06217: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204053.06220: Set connection var ansible_connection to ssh 8975 1727204053.06223: Set connection var ansible_shell_executable to /bin/sh 8975 1727204053.06229: Set connection var ansible_timeout to 10 8975 1727204053.06235: Set connection var ansible_shell_type to sh 8975 1727204053.06248: Set connection var ansible_pipelining to False 8975 1727204053.06270: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.06273: variable 'ansible_connection' from source: unknown 8975 1727204053.06276: variable 'ansible_module_compression' from source: unknown 8975 1727204053.06279: variable 'ansible_shell_type' from source: unknown 8975 1727204053.06282: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.06285: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.06287: variable 'ansible_pipelining' from source: unknown 8975 1727204053.06290: variable 'ansible_timeout' from source: unknown 8975 1727204053.06293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.06410: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204053.06419: variable 'omit' from source: magic vars 8975 1727204053.06424: starting attempt loop 8975 1727204053.06427: running the handler 8975 1727204053.06515: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8975 1727204053.06518: Evaluated conditional (lsr_net_profile_ansible_managed): True 8975 1727204053.06526: handler run complete 8975 1727204053.06541: attempt loop complete, returning result 8975 1727204053.06544: _execute() done 8975 1727204053.06546: dumping result to json 8975 1727204053.06549: done dumping result, returning 8975 1727204053.06556: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0' [127b8e07-fff9-9356-306d-000000000262] 8975 1727204053.06563: sending task result for task 127b8e07-fff9-9356-306d-000000000262 8975 1727204053.06654: done sending task result for task 127b8e07-fff9-9356-306d-000000000262 8975 1727204053.06656: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204053.06735: no more pending results, returning what we have 8975 1727204053.06739: results queue empty 8975 1727204053.06740: checking for any_errors_fatal 8975 1727204053.06749: done checking for any_errors_fatal 8975 1727204053.06750: checking for max_fail_percentage 8975 1727204053.06751: done checking for max_fail_percentage 8975 1727204053.06753: checking to see if all hosts have failed and the running result is not ok 8975 1727204053.06754: done checking to see if all hosts have failed 8975 1727204053.06754: getting the remaining hosts for this loop 8975 1727204053.06756: done getting the remaining hosts for this loop 8975 1727204053.06761: getting the next task for host managed-node2 8975 1727204053.06769: done getting next task for host managed-node2 8975 1727204053.06771: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8975 1727204053.06774: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204053.06778: getting variables 8975 1727204053.06779: in VariableManager get_vars() 8975 1727204053.06821: Calling all_inventory to load vars for managed-node2 8975 1727204053.06824: Calling groups_inventory to load vars for managed-node2 8975 1727204053.06826: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204053.06836: Calling all_plugins_play to load vars for managed-node2 8975 1727204053.06839: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204053.06841: Calling groups_plugins_play to load vars for managed-node2 8975 1727204053.07828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204053.09117: done with get_vars() 8975 1727204053.09139: done getting variables 8975 1727204053.09192: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204053.09288: variable 'profile' from source: include params 8975 1727204053.09291: variable 'item' from source: include params 8975 1727204053.09335: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.046) 0:00:24.410 ***** 8975 1727204053.09363: entering _queue_task() for managed-node2/assert 8975 1727204053.09643: worker is 1 (out of 1 available) 8975 1727204053.09658: exiting _queue_task() for managed-node2/assert 8975 1727204053.09675: done queuing things up, now waiting for results queue to drain 8975 1727204053.09676: waiting for pending results... 
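Throughout this stretch 'profile' and 'item' resolve "from source: include params", which is how variables handed to an included file via include_tasks appear to its tasks. A hedged sketch of the kind of wrapper that would produce that; the wrapper's name and loop form are assumptions, and bond0 / bond0.0 are simply the profile values visible in this excerpt:

- name: Assert profiles are present     # task name assumed; not shown in this excerpt
  include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - bond0
    - bond0.0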
8975 1727204053.09861: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0 8975 1727204053.09941: in run() - task 127b8e07-fff9-9356-306d-000000000263 8975 1727204053.09953: variable 'ansible_search_path' from source: unknown 8975 1727204053.09957: variable 'ansible_search_path' from source: unknown 8975 1727204053.09990: calling self._execute() 8975 1727204053.10077: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.10084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.10093: variable 'omit' from source: magic vars 8975 1727204053.10399: variable 'ansible_distribution_major_version' from source: facts 8975 1727204053.10410: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204053.10417: variable 'omit' from source: magic vars 8975 1727204053.10453: variable 'omit' from source: magic vars 8975 1727204053.10526: variable 'profile' from source: include params 8975 1727204053.10530: variable 'item' from source: include params 8975 1727204053.10584: variable 'item' from source: include params 8975 1727204053.10600: variable 'omit' from source: magic vars 8975 1727204053.10638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204053.10672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204053.10691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204053.10706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.10717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.10746: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204053.10750: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.10753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.10831: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204053.10835: Set connection var ansible_connection to ssh 8975 1727204053.10841: Set connection var ansible_shell_executable to /bin/sh 8975 1727204053.10847: Set connection var ansible_timeout to 10 8975 1727204053.10850: Set connection var ansible_shell_type to sh 8975 1727204053.10859: Set connection var ansible_pipelining to False 8975 1727204053.10884: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.10887: variable 'ansible_connection' from source: unknown 8975 1727204053.10891: variable 'ansible_module_compression' from source: unknown 8975 1727204053.10894: variable 'ansible_shell_type' from source: unknown 8975 1727204053.10897: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.10900: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.10902: variable 'ansible_pipelining' from source: unknown 8975 1727204053.10905: variable 'ansible_timeout' from source: unknown 8975 1727204053.10908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.11018: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204053.11028: variable 'omit' from source: magic vars 8975 1727204053.11035: starting attempt loop 8975 1727204053.11038: running the handler 8975 1727204053.11120: variable 'lsr_net_profile_fingerprint' from source: set_fact 8975 1727204053.11125: Evaluated conditional (lsr_net_profile_fingerprint): True 8975 1727204053.11134: handler run complete 8975 1727204053.11151: attempt loop complete, returning result 8975 1727204053.11154: _execute() done 8975 1727204053.11158: dumping result to json 8975 1727204053.11160: done dumping result, returning 8975 1727204053.11167: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0 [127b8e07-fff9-9356-306d-000000000263] 8975 1727204053.11174: sending task result for task 127b8e07-fff9-9356-306d-000000000263 8975 1727204053.11263: done sending task result for task 127b8e07-fff9-9356-306d-000000000263 8975 1727204053.11269: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204053.11321: no more pending results, returning what we have 8975 1727204053.11324: results queue empty 8975 1727204053.11325: checking for any_errors_fatal 8975 1727204053.11330: done checking for any_errors_fatal 8975 1727204053.11331: checking for max_fail_percentage 8975 1727204053.11333: done checking for max_fail_percentage 8975 1727204053.11334: checking to see if all hosts have failed and the running result is not ok 8975 1727204053.11335: done checking to see if all hosts have failed 8975 1727204053.11336: getting the remaining hosts for this loop 8975 1727204053.11338: done getting the remaining hosts for this loop 8975 1727204053.11342: getting the next task for host managed-node2 8975 1727204053.11353: done getting next task for host managed-node2 8975 1727204053.11356: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8975 1727204053.11359: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204053.11364: getting variables 8975 1727204053.11367: in VariableManager get_vars() 8975 1727204053.11413: Calling all_inventory to load vars for managed-node2 8975 1727204053.11416: Calling groups_inventory to load vars for managed-node2 8975 1727204053.11418: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204053.11429: Calling all_plugins_play to load vars for managed-node2 8975 1727204053.11432: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204053.11434: Calling groups_plugins_play to load vars for managed-node2 8975 1727204053.12444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204053.13621: done with get_vars() 8975 1727204053.13647: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.043) 0:00:24.454 ***** 8975 1727204053.13724: entering _queue_task() for managed-node2/include_tasks 8975 1727204053.14038: worker is 1 (out of 1 available) 8975 1727204053.14053: exiting _queue_task() for managed-node2/include_tasks 8975 1727204053.14268: done queuing things up, now waiting for results queue to drain 8975 1727204053.14271: waiting for pending results... 8975 1727204053.14486: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 8975 1727204053.14514: in run() - task 127b8e07-fff9-9356-306d-000000000267 8975 1727204053.14584: variable 'ansible_search_path' from source: unknown 8975 1727204053.14588: variable 'ansible_search_path' from source: unknown 8975 1727204053.14600: calling self._execute() 8975 1727204053.14718: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.14733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.14752: variable 'omit' from source: magic vars 8975 1727204053.15203: variable 'ansible_distribution_major_version' from source: facts 8975 1727204053.15239: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204053.15243: _execute() done 8975 1727204053.15348: dumping result to json 8975 1727204053.15352: done dumping result, returning 8975 1727204053.15354: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-9356-306d-000000000267] 8975 1727204053.15357: sending task result for task 127b8e07-fff9-9356-306d-000000000267 8975 1727204053.15444: done sending task result for task 127b8e07-fff9-9356-306d-000000000267 8975 1727204053.15571: WORKER PROCESS EXITING 8975 1727204053.15602: no more pending results, returning what we have 8975 1727204053.15608: in VariableManager get_vars() 8975 1727204053.15664: Calling all_inventory to load vars for managed-node2 8975 1727204053.15670: Calling groups_inventory to load vars for managed-node2 8975 1727204053.15672: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204053.15688: Calling all_plugins_play to load vars for managed-node2 8975 1727204053.15692: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204053.15695: Calling groups_plugins_play to load vars for managed-node2 8975 1727204053.17723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204053.19538: 
done with get_vars() 8975 1727204053.19563: variable 'ansible_search_path' from source: unknown 8975 1727204053.19564: variable 'ansible_search_path' from source: unknown 8975 1727204053.19598: we have included files to process 8975 1727204053.19599: generating all_blocks data 8975 1727204053.19600: done generating all_blocks data 8975 1727204053.19605: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204053.19606: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204053.19608: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204053.20262: done processing included file 8975 1727204053.20264: iterating over new_blocks loaded from include file 8975 1727204053.20267: in VariableManager get_vars() 8975 1727204053.20285: done with get_vars() 8975 1727204053.20286: filtering new block on tags 8975 1727204053.20304: done filtering new block on tags 8975 1727204053.20306: in VariableManager get_vars() 8975 1727204053.20318: done with get_vars() 8975 1727204053.20319: filtering new block on tags 8975 1727204053.20335: done filtering new block on tags 8975 1727204053.20336: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 8975 1727204053.20340: extending task lists for all hosts with included blocks 8975 1727204053.20457: done extending task lists 8975 1727204053.20458: done processing included files 8975 1727204053.20459: results queue empty 8975 1727204053.20459: checking for any_errors_fatal 8975 1727204053.20461: done checking for any_errors_fatal 8975 1727204053.20462: checking for max_fail_percentage 8975 1727204053.20463: done checking for max_fail_percentage 8975 1727204053.20463: checking to see if all hosts have failed and the running result is not ok 8975 1727204053.20464: done checking to see if all hosts have failed 8975 1727204053.20464: getting the remaining hosts for this loop 8975 1727204053.20467: done getting the remaining hosts for this loop 8975 1727204053.20469: getting the next task for host managed-node2 8975 1727204053.20472: done getting next task for host managed-node2 8975 1727204053.20473: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8975 1727204053.20475: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204053.20477: getting variables 8975 1727204053.20479: in VariableManager get_vars() 8975 1727204053.20490: Calling all_inventory to load vars for managed-node2 8975 1727204053.20492: Calling groups_inventory to load vars for managed-node2 8975 1727204053.20493: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204053.20497: Calling all_plugins_play to load vars for managed-node2 8975 1727204053.20499: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204053.20501: Calling groups_plugins_play to load vars for managed-node2 8975 1727204053.21439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204053.22612: done with get_vars() 8975 1727204053.22639: done getting variables 8975 1727204053.22680: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.089) 0:00:24.543 ***** 8975 1727204053.22704: entering _queue_task() for managed-node2/set_fact 8975 1727204053.22997: worker is 1 (out of 1 available) 8975 1727204053.23012: exiting _queue_task() for managed-node2/set_fact 8975 1727204053.23027: done queuing things up, now waiting for results queue to drain 8975 1727204053.23028: waiting for pending results... 8975 1727204053.23247: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 8975 1727204053.23339: in run() - task 127b8e07-fff9-9356-306d-0000000003fb 8975 1727204053.23351: variable 'ansible_search_path' from source: unknown 8975 1727204053.23356: variable 'ansible_search_path' from source: unknown 8975 1727204053.23392: calling self._execute() 8975 1727204053.23476: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.23483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.23493: variable 'omit' from source: magic vars 8975 1727204053.23801: variable 'ansible_distribution_major_version' from source: facts 8975 1727204053.23811: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204053.23816: variable 'omit' from source: magic vars 8975 1727204053.23859: variable 'omit' from source: magic vars 8975 1727204053.23889: variable 'omit' from source: magic vars 8975 1727204053.23930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204053.23966: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204053.23982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204053.23997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.24007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.24036: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 8975 1727204053.24041: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.24043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.24122: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204053.24126: Set connection var ansible_connection to ssh 8975 1727204053.24132: Set connection var ansible_shell_executable to /bin/sh 8975 1727204053.24135: Set connection var ansible_timeout to 10 8975 1727204053.24138: Set connection var ansible_shell_type to sh 8975 1727204053.24148: Set connection var ansible_pipelining to False 8975 1727204053.24168: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.24171: variable 'ansible_connection' from source: unknown 8975 1727204053.24175: variable 'ansible_module_compression' from source: unknown 8975 1727204053.24184: variable 'ansible_shell_type' from source: unknown 8975 1727204053.24187: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.24190: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.24192: variable 'ansible_pipelining' from source: unknown 8975 1727204053.24194: variable 'ansible_timeout' from source: unknown 8975 1727204053.24196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.24307: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204053.24319: variable 'omit' from source: magic vars 8975 1727204053.24324: starting attempt loop 8975 1727204053.24329: running the handler 8975 1727204053.24340: handler run complete 8975 1727204053.24348: attempt loop complete, returning result 8975 1727204053.24351: _execute() done 8975 1727204053.24354: dumping result to json 8975 1727204053.24358: done dumping result, returning 8975 1727204053.24363: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-9356-306d-0000000003fb] 8975 1727204053.24371: sending task result for task 127b8e07-fff9-9356-306d-0000000003fb 8975 1727204053.24461: done sending task result for task 127b8e07-fff9-9356-306d-0000000003fb 8975 1727204053.24464: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 8975 1727204053.24532: no more pending results, returning what we have 8975 1727204053.24536: results queue empty 8975 1727204053.24537: checking for any_errors_fatal 8975 1727204053.24538: done checking for any_errors_fatal 8975 1727204053.24539: checking for max_fail_percentage 8975 1727204053.24541: done checking for max_fail_percentage 8975 1727204053.24542: checking to see if all hosts have failed and the running result is not ok 8975 1727204053.24543: done checking to see if all hosts have failed 8975 1727204053.24544: getting the remaining hosts for this loop 8975 1727204053.24545: done getting the remaining hosts for this loop 8975 1727204053.24550: getting the next task for host managed-node2 8975 1727204053.24558: done getting next task for host managed-node2 8975 1727204053.24561: ^ task is: TASK: Stat profile file 8975 1727204053.24564: ^ state is: HOST 
STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204053.24571: getting variables 8975 1727204053.24572: in VariableManager get_vars() 8975 1727204053.24622: Calling all_inventory to load vars for managed-node2 8975 1727204053.24624: Calling groups_inventory to load vars for managed-node2 8975 1727204053.24626: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204053.24641: Calling all_plugins_play to load vars for managed-node2 8975 1727204053.24644: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204053.24647: Calling groups_plugins_play to load vars for managed-node2 8975 1727204053.25663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204053.26867: done with get_vars() 8975 1727204053.26895: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.042) 0:00:24.586 ***** 8975 1727204053.26980: entering _queue_task() for managed-node2/stat 8975 1727204053.27273: worker is 1 (out of 1 available) 8975 1727204053.27288: exiting _queue_task() for managed-node2/stat 8975 1727204053.27301: done queuing things up, now waiting for results queue to drain 8975 1727204053.27302: waiting for pending results... 
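The result echoed above for "Initialize NM profile exist and ansible_managed comment flag" names the three facts and their starting values; set_fact, like assert, runs entirely on the controller, which is why no SSH activity appears for it. A reconstruction of roughly what that step at get_profile_stat.yml:3 contains, based only on the fact names and values reported in the log:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

The assert tasks seen earlier then simply check whether later steps have flipped these flags to true.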
8975 1727204053.27491: running TaskExecutor() for managed-node2/TASK: Stat profile file 8975 1727204053.27585: in run() - task 127b8e07-fff9-9356-306d-0000000003fc 8975 1727204053.27597: variable 'ansible_search_path' from source: unknown 8975 1727204053.27601: variable 'ansible_search_path' from source: unknown 8975 1727204053.27636: calling self._execute() 8975 1727204053.27716: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.27720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.27733: variable 'omit' from source: magic vars 8975 1727204053.28037: variable 'ansible_distribution_major_version' from source: facts 8975 1727204053.28047: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204053.28054: variable 'omit' from source: magic vars 8975 1727204053.28095: variable 'omit' from source: magic vars 8975 1727204053.28164: variable 'profile' from source: include params 8975 1727204053.28171: variable 'item' from source: include params 8975 1727204053.28220: variable 'item' from source: include params 8975 1727204053.28236: variable 'omit' from source: magic vars 8975 1727204053.28276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204053.28313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204053.28329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204053.28343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.28352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.28380: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204053.28384: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.28387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.28468: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204053.28472: Set connection var ansible_connection to ssh 8975 1727204053.28477: Set connection var ansible_shell_executable to /bin/sh 8975 1727204053.28482: Set connection var ansible_timeout to 10 8975 1727204053.28486: Set connection var ansible_shell_type to sh 8975 1727204053.28495: Set connection var ansible_pipelining to False 8975 1727204053.28516: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.28520: variable 'ansible_connection' from source: unknown 8975 1727204053.28531: variable 'ansible_module_compression' from source: unknown 8975 1727204053.28534: variable 'ansible_shell_type' from source: unknown 8975 1727204053.28536: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.28538: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.28540: variable 'ansible_pipelining' from source: unknown 8975 1727204053.28543: variable 'ansible_timeout' from source: unknown 8975 1727204053.28545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.28705: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204053.28714: variable 'omit' from source: magic vars 8975 1727204053.28720: starting attempt loop 8975 1727204053.28724: running the handler 8975 1727204053.28738: _low_level_execute_command(): starting 8975 1727204053.28746: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204053.29320: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204053.29326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204053.29332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.29372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.29377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.29393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.29473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.31160: stdout chunk (state=3): >>>/root <<< 8975 1727204053.31272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.31337: stderr chunk (state=3): >>><<< 8975 1727204053.31340: stdout chunk (state=3): >>><<< 8975 1727204053.31364: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204053.31377: _low_level_execute_command(): starting 8975 1727204053.31385: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493 `" && echo ansible-tmp-1727204053.3136346-11289-267984195801493="` echo /root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493 `" ) && sleep 0' 8975 1727204053.31863: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204053.31871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.31892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.31943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.31947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.32027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.33988: stdout chunk (state=3): >>>ansible-tmp-1727204053.3136346-11289-267984195801493=/root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493 <<< 8975 1727204053.34088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.34155: stderr chunk (state=3): >>><<< 8975 1727204053.34158: stdout chunk (state=3): >>><<< 8975 1727204053.34178: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204053.3136346-11289-267984195801493=/root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 8975 1727204053.34226: variable 'ansible_module_compression' from source: unknown 8975 1727204053.34273: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8975 1727204053.34309: variable 'ansible_facts' from source: unknown 8975 1727204053.34362: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/AnsiballZ_stat.py 8975 1727204053.34471: Sending initial data 8975 1727204053.34482: Sent initial data (152 bytes) 8975 1727204053.34981: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204053.34985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.34987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204053.34989: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204053.34992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.35048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.35051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.35056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.35126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.36728: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204053.36789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204053.36858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpd2hvh9ns /root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/AnsiballZ_stat.py <<< 8975 1727204053.36868: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/AnsiballZ_stat.py" <<< 8975 1727204053.36932: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpd2hvh9ns" to remote "/root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/AnsiballZ_stat.py" <<< 8975 1727204053.36936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/AnsiballZ_stat.py" <<< 8975 1727204053.37584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.37667: stderr chunk (state=3): >>><<< 8975 1727204053.37671: stdout chunk (state=3): >>><<< 8975 1727204053.37691: done transferring module to remote 8975 1727204053.37703: _low_level_execute_command(): starting 8975 1727204053.37708: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/ /root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/AnsiballZ_stat.py && sleep 0' 8975 1727204053.38203: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204053.38207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 8975 1727204053.38210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8975 1727204053.38212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204053.38218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.38273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.38276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.38285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.38351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.40169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.40230: stderr chunk (state=3): >>><<< 8975 1727204053.40235: stdout chunk (state=3): >>><<< 8975 1727204053.40250: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204053.40254: _low_level_execute_command(): starting 8975 1727204053.40259: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/AnsiballZ_stat.py && sleep 0' 8975 1727204053.40757: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204053.40761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.40764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204053.40773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.40821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.40824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.40829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.40912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.57418: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 8975 1727204053.58734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204053.58792: stderr chunk (state=3): >>><<< 8975 1727204053.58796: stdout chunk (state=3): >>><<< 8975 1727204053.58812: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204053.58842: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204053.58853: _low_level_execute_command(): starting 8975 1727204053.58858: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204053.3136346-11289-267984195801493/ > /dev/null 2>&1 && sleep 0' 8975 1727204053.59357: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204053.59361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.59364: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204053.59375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.59419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.59422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.59425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.59503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.61429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.61487: stderr chunk (state=3): >>><<< 8975 1727204053.61492: stdout chunk (state=3): >>><<< 8975 1727204053.61506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204053.61512: handler run complete 8975 1727204053.61536: attempt loop complete, returning result 8975 1727204053.61539: _execute() done 8975 1727204053.61542: dumping result to json 8975 1727204053.61544: done dumping result, returning 8975 1727204053.61553: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-9356-306d-0000000003fc] 8975 1727204053.61559: sending task result for task 127b8e07-fff9-9356-306d-0000000003fc 8975 1727204053.61670: done sending task result for task 127b8e07-fff9-9356-306d-0000000003fc 8975 1727204053.61673: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 8975 1727204053.61738: no more pending results, returning what we have 8975 1727204053.61742: results queue empty 8975 1727204053.61742: checking for any_errors_fatal 8975 1727204053.61751: done checking for any_errors_fatal 8975 1727204053.61752: checking for max_fail_percentage 8975 1727204053.61754: done checking for max_fail_percentage 8975 1727204053.61755: checking to see if all hosts have failed and the running result is not ok 8975 1727204053.61756: done checking to see if all hosts have failed 8975 1727204053.61757: getting the remaining hosts for this loop 8975 1727204053.61759: done getting the remaining hosts for this loop 8975 
1727204053.61763: getting the next task for host managed-node2 8975 1727204053.61772: done getting next task for host managed-node2 8975 1727204053.61775: ^ task is: TASK: Set NM profile exist flag based on the profile files 8975 1727204053.61778: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204053.61789: getting variables 8975 1727204053.61791: in VariableManager get_vars() 8975 1727204053.61835: Calling all_inventory to load vars for managed-node2 8975 1727204053.61838: Calling groups_inventory to load vars for managed-node2 8975 1727204053.61840: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204053.61852: Calling all_plugins_play to load vars for managed-node2 8975 1727204053.61855: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204053.61857: Calling groups_plugins_play to load vars for managed-node2 8975 1727204053.63012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204053.64215: done with get_vars() 8975 1727204053.64245: done getting variables 8975 1727204053.64300: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.373) 0:00:24.960 ***** 8975 1727204053.64325: entering _queue_task() for managed-node2/set_fact 8975 1727204053.64620: worker is 1 (out of 1 available) 8975 1727204053.64643: exiting _queue_task() for managed-node2/set_fact 8975 1727204053.64668: done queuing things up, now waiting for results queue to drain 8975 1727204053.64673: waiting for pending results... 
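The stat run above looked for a legacy initscripts profile at /etc/sysconfig/network-scripts/ifcfg-bond0.0 and reported "exists": false; later conditionals in this log refer to that registered result as profile_stat. A minimal reconstruction of the task from the logged module_args follows — the register name and the ifcfg-{{ profile }} templating are inferred from the surrounding variable trace, not copied from the playbook:

    # Sketch reconstructed from the logged module_args; the register name and
    # the {{ profile }} templating are inferred assumptions.
    - name: Stat profile file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat
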
8975 1727204053.65011: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 8975 1727204053.65103: in run() - task 127b8e07-fff9-9356-306d-0000000003fd 8975 1727204053.65117: variable 'ansible_search_path' from source: unknown 8975 1727204053.65121: variable 'ansible_search_path' from source: unknown 8975 1727204053.65157: calling self._execute() 8975 1727204053.65244: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.65248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.65258: variable 'omit' from source: magic vars 8975 1727204053.65573: variable 'ansible_distribution_major_version' from source: facts 8975 1727204053.65584: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204053.65679: variable 'profile_stat' from source: set_fact 8975 1727204053.65691: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204053.65696: when evaluation is False, skipping this task 8975 1727204053.65699: _execute() done 8975 1727204053.65702: dumping result to json 8975 1727204053.65705: done dumping result, returning 8975 1727204053.65711: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-9356-306d-0000000003fd] 8975 1727204053.65717: sending task result for task 127b8e07-fff9-9356-306d-0000000003fd 8975 1727204053.65812: done sending task result for task 127b8e07-fff9-9356-306d-0000000003fd 8975 1727204053.65817: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204053.65872: no more pending results, returning what we have 8975 1727204053.65875: results queue empty 8975 1727204053.65876: checking for any_errors_fatal 8975 1727204053.65887: done checking for any_errors_fatal 8975 1727204053.65888: checking for max_fail_percentage 8975 1727204053.65889: done checking for max_fail_percentage 8975 1727204053.65890: checking to see if all hosts have failed and the running result is not ok 8975 1727204053.65891: done checking to see if all hosts have failed 8975 1727204053.65892: getting the remaining hosts for this loop 8975 1727204053.65895: done getting the remaining hosts for this loop 8975 1727204053.65899: getting the next task for host managed-node2 8975 1727204053.65907: done getting next task for host managed-node2 8975 1727204053.65910: ^ task is: TASK: Get NM profile info 8975 1727204053.65914: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204053.65919: getting variables 8975 1727204053.65920: in VariableManager get_vars() 8975 1727204053.65972: Calling all_inventory to load vars for managed-node2 8975 1727204053.65974: Calling groups_inventory to load vars for managed-node2 8975 1727204053.65976: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204053.65988: Calling all_plugins_play to load vars for managed-node2 8975 1727204053.65990: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204053.65993: Calling groups_plugins_play to load vars for managed-node2 8975 1727204053.67176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204053.69310: done with get_vars() 8975 1727204053.69353: done getting variables 8975 1727204053.69426: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.051) 0:00:25.011 ***** 8975 1727204053.69461: entering _queue_task() for managed-node2/shell 8975 1727204053.69841: worker is 1 (out of 1 available) 8975 1727204053.69854: exiting _queue_task() for managed-node2/shell 8975 1727204053.70071: done queuing things up, now waiting for results queue to drain 8975 1727204053.70074: waiting for pending results... 8975 1727204053.70286: running TaskExecutor() for managed-node2/TASK: Get NM profile info 8975 1727204053.70306: in run() - task 127b8e07-fff9-9356-306d-0000000003fe 8975 1727204053.70326: variable 'ansible_search_path' from source: unknown 8975 1727204053.70334: variable 'ansible_search_path' from source: unknown 8975 1727204053.70382: calling self._execute() 8975 1727204053.70515: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.70518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.70521: variable 'omit' from source: magic vars 8975 1727204053.70916: variable 'ansible_distribution_major_version' from source: facts 8975 1727204053.70935: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204053.71057: variable 'omit' from source: magic vars 8975 1727204053.71061: variable 'omit' from source: magic vars 8975 1727204053.71130: variable 'profile' from source: include params 8975 1727204053.71141: variable 'item' from source: include params 8975 1727204053.71219: variable 'item' from source: include params 8975 1727204053.71246: variable 'omit' from source: magic vars 8975 1727204053.71302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204053.71347: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204053.71379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204053.71405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.71421: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204053.71459: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204053.71494: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.71497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.71598: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204053.71612: Set connection var ansible_connection to ssh 8975 1727204053.71671: Set connection var ansible_shell_executable to /bin/sh 8975 1727204053.71675: Set connection var ansible_timeout to 10 8975 1727204053.71678: Set connection var ansible_shell_type to sh 8975 1727204053.71681: Set connection var ansible_pipelining to False 8975 1727204053.71686: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.71694: variable 'ansible_connection' from source: unknown 8975 1727204053.71701: variable 'ansible_module_compression' from source: unknown 8975 1727204053.71712: variable 'ansible_shell_type' from source: unknown 8975 1727204053.71720: variable 'ansible_shell_executable' from source: unknown 8975 1727204053.71727: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204053.71735: variable 'ansible_pipelining' from source: unknown 8975 1727204053.71743: variable 'ansible_timeout' from source: unknown 8975 1727204053.71805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204053.71913: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204053.71919: variable 'omit' from source: magic vars 8975 1727204053.71926: starting attempt loop 8975 1727204053.71931: running the handler 8975 1727204053.71938: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204053.71956: _low_level_execute_command(): starting 8975 1727204053.71963: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204053.72532: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204053.72536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8975 1727204053.72540: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.72598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.72601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.72683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.74346: stdout chunk (state=3): >>>/root <<< 8975 1727204053.74561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.74564: stdout chunk (state=3): >>><<< 8975 1727204053.74569: stderr chunk (state=3): >>><<< 8975 1727204053.74592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204053.74676: _low_level_execute_command(): starting 8975 1727204053.74682: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559 `" && echo ansible-tmp-1727204053.7459986-11298-200121046664559="` echo /root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559 `" ) && sleep 0' 8975 1727204053.75326: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204053.75361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204053.75377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204053.75397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204053.75472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.75527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.75545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.75587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.75691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.77688: stdout chunk (state=3): >>>ansible-tmp-1727204053.7459986-11298-200121046664559=/root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559 <<< 8975 1727204053.77910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.77914: stdout chunk (state=3): >>><<< 8975 1727204053.77917: stderr chunk (state=3): >>><<< 8975 1727204053.78072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204053.7459986-11298-200121046664559=/root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204053.78076: variable 'ansible_module_compression' from source: unknown 8975 1727204053.78079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204053.78104: variable 'ansible_facts' from source: unknown 8975 1727204053.78197: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/AnsiballZ_command.py 8975 1727204053.78443: Sending initial data 8975 1727204053.78446: Sent initial data (155 bytes) 8975 1727204053.79192: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204053.79230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.79248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.79262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.79376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.80993: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204053.81056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204053.81127: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp1wqlmhjc /root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/AnsiballZ_command.py <<< 8975 1727204053.81131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/AnsiballZ_command.py" <<< 8975 1727204053.81207: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp1wqlmhjc" to remote "/root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/AnsiballZ_command.py" <<< 8975 1727204053.82097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.82174: stderr chunk (state=3): >>><<< 8975 1727204053.82183: stdout chunk (state=3): >>><<< 8975 1727204053.82225: done transferring module to remote 8975 1727204053.82244: _low_level_execute_command(): starting 8975 1727204053.82253: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/ /root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/AnsiballZ_command.py && sleep 0' 8975 1727204053.82953: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204053.83088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204053.83092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.83121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.83228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204053.85161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204053.85183: stderr chunk (state=3): >>><<< 8975 1727204053.85201: stdout chunk (state=3): >>><<< 8975 1727204053.85230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204053.85240: _low_level_execute_command(): starting 8975 1727204053.85336: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/AnsiballZ_command.py && sleep 0' 8975 1727204053.86001: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204053.86024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204053.86152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204053.86177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204053.86195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204053.86325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204054.04857: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:14.025172", "end": "2024-09-24 14:54:14.047223", "delta": "0:00:00.022051", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204054.06490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204054.06580: stderr chunk (state=3): >>><<< 8975 1727204054.06590: stdout chunk (state=3): >>><<< 8975 1727204054.06750: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:14.025172", "end": "2024-09-24 14:54:14.047223", "delta": "0:00:00.022051", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
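The raw result above comes from the "Get NM profile info" shell task (get_profile_stat.yml:25): nmcli reports that the bond0.0 connection is stored as a NetworkManager keyfile at /etc/NetworkManager/system-connections/bond0.0.nmconnection, which also explains why the earlier ifcfg stat found nothing. A sketch of the task as it likely appears — only the rendered command string is confirmed by the logged module_args; the {{ profile }} templating and the nm_profile_exists register name are inferred from the include-params trace and the later rc == 0 conditional:

    # Sketch; command string taken from the logged module_args, templated
    # form and register name are assumptions.
    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
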
8975 1727204054.06755: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204054.06758: _low_level_execute_command(): starting 8975 1727204054.06761: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204053.7459986-11298-200121046664559/ > /dev/null 2>&1 && sleep 0' 8975 1727204054.07384: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204054.07447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204054.07471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204054.07494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204054.07605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204054.09597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204054.09629: stdout chunk (state=3): >>><<< 8975 1727204054.09633: stderr chunk (state=3): >>><<< 8975 1727204054.09650: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204054.09662: handler run complete 8975 1727204054.09693: Evaluated conditional (False): False 8975 1727204054.09771: attempt loop complete, returning result 8975 1727204054.09774: _execute() done 8975 1727204054.09777: dumping result to json 8975 1727204054.09779: done dumping result, returning 8975 1727204054.09781: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-9356-306d-0000000003fe] 8975 1727204054.09783: sending task result for task 127b8e07-fff9-9356-306d-0000000003fe ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.022051", "end": "2024-09-24 14:54:14.047223", "rc": 0, "start": "2024-09-24 14:54:14.025172" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 8975 1727204054.10145: no more pending results, returning what we have 8975 1727204054.10149: results queue empty 8975 1727204054.10149: checking for any_errors_fatal 8975 1727204054.10154: done checking for any_errors_fatal 8975 1727204054.10155: checking for max_fail_percentage 8975 1727204054.10157: done checking for max_fail_percentage 8975 1727204054.10158: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.10159: done checking to see if all hosts have failed 8975 1727204054.10159: getting the remaining hosts for this loop 8975 1727204054.10161: done getting the remaining hosts for this loop 8975 1727204054.10168: getting the next task for host managed-node2 8975 1727204054.10175: done getting next task for host managed-node2 8975 1727204054.10178: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8975 1727204054.10181: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.10186: getting variables 8975 1727204054.10188: in VariableManager get_vars() 8975 1727204054.10232: Calling all_inventory to load vars for managed-node2 8975 1727204054.10235: Calling groups_inventory to load vars for managed-node2 8975 1727204054.10237: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.10250: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.10253: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.10256: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.10807: done sending task result for task 127b8e07-fff9-9356-306d-0000000003fe 8975 1727204054.10813: WORKER PROCESS EXITING 8975 1727204054.12148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.14425: done with get_vars() 8975 1727204054.14467: done getting variables 8975 1727204054.14541: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.451) 0:00:25.462 ***** 8975 1727204054.14582: entering _queue_task() for managed-node2/set_fact 8975 1727204054.15015: worker is 1 (out of 1 available) 8975 1727204054.15029: exiting _queue_task() for managed-node2/set_fact 8975 1727204054.15045: done queuing things up, now waiting for results queue to drain 8975 1727204054.15160: waiting for pending results... 
8975 1727204054.15351: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8975 1727204054.15502: in run() - task 127b8e07-fff9-9356-306d-0000000003ff 8975 1727204054.15523: variable 'ansible_search_path' from source: unknown 8975 1727204054.15531: variable 'ansible_search_path' from source: unknown 8975 1727204054.15575: calling self._execute() 8975 1727204054.15693: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.15716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.15816: variable 'omit' from source: magic vars 8975 1727204054.16142: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.16165: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.16318: variable 'nm_profile_exists' from source: set_fact 8975 1727204054.16340: Evaluated conditional (nm_profile_exists.rc == 0): True 8975 1727204054.16357: variable 'omit' from source: magic vars 8975 1727204054.16424: variable 'omit' from source: magic vars 8975 1727204054.16471: variable 'omit' from source: magic vars 8975 1727204054.16526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204054.16587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204054.16670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204054.16673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204054.16680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204054.16687: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204054.16699: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.16708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.16833: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204054.16841: Set connection var ansible_connection to ssh 8975 1727204054.16853: Set connection var ansible_shell_executable to /bin/sh 8975 1727204054.16863: Set connection var ansible_timeout to 10 8975 1727204054.16872: Set connection var ansible_shell_type to sh 8975 1727204054.16891: Set connection var ansible_pipelining to False 8975 1727204054.16929: variable 'ansible_shell_executable' from source: unknown 8975 1727204054.16938: variable 'ansible_connection' from source: unknown 8975 1727204054.16945: variable 'ansible_module_compression' from source: unknown 8975 1727204054.17009: variable 'ansible_shell_type' from source: unknown 8975 1727204054.17012: variable 'ansible_shell_executable' from source: unknown 8975 1727204054.17016: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.17018: variable 'ansible_pipelining' from source: unknown 8975 1727204054.17021: variable 'ansible_timeout' from source: unknown 8975 1727204054.17023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.17160: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204054.17182: variable 'omit' from source: magic vars 8975 1727204054.17192: starting attempt loop 8975 1727204054.17226: running the handler 8975 1727204054.17229: handler run complete 8975 1727204054.17236: attempt loop complete, returning result 8975 1727204054.17248: _execute() done 8975 1727204054.17255: dumping result to json 8975 1727204054.17263: done dumping result, returning 8975 1727204054.17335: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-9356-306d-0000000003ff] 8975 1727204054.17339: sending task result for task 127b8e07-fff9-9356-306d-0000000003ff 8975 1727204054.17413: done sending task result for task 127b8e07-fff9-9356-306d-0000000003ff 8975 1727204054.17417: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8975 1727204054.17490: no more pending results, returning what we have 8975 1727204054.17493: results queue empty 8975 1727204054.17494: checking for any_errors_fatal 8975 1727204054.17504: done checking for any_errors_fatal 8975 1727204054.17505: checking for max_fail_percentage 8975 1727204054.17507: done checking for max_fail_percentage 8975 1727204054.17508: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.17510: done checking to see if all hosts have failed 8975 1727204054.17510: getting the remaining hosts for this loop 8975 1727204054.17513: done getting the remaining hosts for this loop 8975 1727204054.17517: getting the next task for host managed-node2 8975 1727204054.17531: done getting next task for host managed-node2 8975 1727204054.17534: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 8975 1727204054.17539: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.17544: getting variables 8975 1727204054.17661: in VariableManager get_vars() 8975 1727204054.17709: Calling all_inventory to load vars for managed-node2 8975 1727204054.17712: Calling groups_inventory to load vars for managed-node2 8975 1727204054.17714: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.17725: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.17728: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.17731: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.20541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.25447: done with get_vars() 8975 1727204054.25698: done getting variables 8975 1727204054.25799: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204054.25935: variable 'profile' from source: include params 8975 1727204054.25940: variable 'item' from source: include params 8975 1727204054.26009: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.114) 0:00:25.577 ***** 8975 1727204054.26049: entering _queue_task() for managed-node2/command 8975 1727204054.26432: worker is 1 (out of 1 available) 8975 1727204054.26446: exiting _queue_task() for managed-node2/command 8975 1727204054.26459: done queuing things up, now waiting for results queue to drain 8975 1727204054.26461: waiting for pending results... 
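Before the command task queued above runs, note what the set_fact at get_profile_stat.yml:35 just did: because nm_profile_exists.rc == 0, it recorded lsr_net_profile_exists, lsr_net_profile_ansible_managed, and lsr_net_profile_fingerprint as true for managed-node2. The task body itself is not printed, so treat the following as a sketch assembled from the logged conditional and the ansible_facts returned in the task result:

    # Sketch assembled from "Evaluated conditional (nm_profile_exists.rc == 0)"
    # and the ansible_facts shown in the ok: result above.
    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0
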
8975 1727204054.26888: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 8975 1727204054.26928: in run() - task 127b8e07-fff9-9356-306d-000000000401 8975 1727204054.26949: variable 'ansible_search_path' from source: unknown 8975 1727204054.26958: variable 'ansible_search_path' from source: unknown 8975 1727204054.27007: calling self._execute() 8975 1727204054.27348: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.27353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.27356: variable 'omit' from source: magic vars 8975 1727204054.27554: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.27575: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.27720: variable 'profile_stat' from source: set_fact 8975 1727204054.27741: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204054.27748: when evaluation is False, skipping this task 8975 1727204054.27756: _execute() done 8975 1727204054.27769: dumping result to json 8975 1727204054.27780: done dumping result, returning 8975 1727204054.27793: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [127b8e07-fff9-9356-306d-000000000401] 8975 1727204054.27804: sending task result for task 127b8e07-fff9-9356-306d-000000000401 8975 1727204054.28072: done sending task result for task 127b8e07-fff9-9356-306d-000000000401 8975 1727204054.28076: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204054.28135: no more pending results, returning what we have 8975 1727204054.28140: results queue empty 8975 1727204054.28141: checking for any_errors_fatal 8975 1727204054.28148: done checking for any_errors_fatal 8975 1727204054.28149: checking for max_fail_percentage 8975 1727204054.28151: done checking for max_fail_percentage 8975 1727204054.28152: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.28154: done checking to see if all hosts have failed 8975 1727204054.28154: getting the remaining hosts for this loop 8975 1727204054.28157: done getting the remaining hosts for this loop 8975 1727204054.28162: getting the next task for host managed-node2 8975 1727204054.28175: done getting next task for host managed-node2 8975 1727204054.28179: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8975 1727204054.28184: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.28189: getting variables 8975 1727204054.28191: in VariableManager get_vars() 8975 1727204054.28243: Calling all_inventory to load vars for managed-node2 8975 1727204054.28246: Calling groups_inventory to load vars for managed-node2 8975 1727204054.28249: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.28458: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.28464: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.28470: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.30256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.33607: done with get_vars() 8975 1727204054.33649: done getting variables 8975 1727204054.33921: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204054.34043: variable 'profile' from source: include params 8975 1727204054.34047: variable 'item' from source: include params 8975 1727204054.34315: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.082) 0:00:25.660 ***** 8975 1727204054.34351: entering _queue_task() for managed-node2/set_fact 8975 1727204054.35136: worker is 1 (out of 1 available) 8975 1727204054.35150: exiting _queue_task() for managed-node2/set_fact 8975 1727204054.35168: done queuing things up, now waiting for results queue to drain 8975 1727204054.35169: waiting for pending results... 
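The "Get the ansible_managed comment in ifcfg-bond0.0" task (get_profile_stat.yml:49) above was skipped for the same reason as the earlier profile-file flag task: its guard, profile_stat.stat.exists, is false on this keyfile-backed host. Only the task name, the module type (command), and the condition are visible in the log; the grep invocation and register name below are purely illustrative:

    # Hypothetical sketch of the skipped ifcfg check -- the grep command and
    # register name are illustrations; only the when: condition is confirmed.
    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: ifcfg_ansible_managed_check   # hypothetical name
      when: profile_stat.stat.exists
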
8975 1727204054.35871: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 8975 1727204054.36011: in run() - task 127b8e07-fff9-9356-306d-000000000402 8975 1727204054.36036: variable 'ansible_search_path' from source: unknown 8975 1727204054.36045: variable 'ansible_search_path' from source: unknown 8975 1727204054.36104: calling self._execute() 8975 1727204054.36209: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.36236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.36343: variable 'omit' from source: magic vars 8975 1727204054.36638: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.36657: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.36798: variable 'profile_stat' from source: set_fact 8975 1727204054.36819: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204054.36827: when evaluation is False, skipping this task 8975 1727204054.36835: _execute() done 8975 1727204054.36843: dumping result to json 8975 1727204054.36850: done dumping result, returning 8975 1727204054.36862: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [127b8e07-fff9-9356-306d-000000000402] 8975 1727204054.36876: sending task result for task 127b8e07-fff9-9356-306d-000000000402 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204054.37049: no more pending results, returning what we have 8975 1727204054.37054: results queue empty 8975 1727204054.37055: checking for any_errors_fatal 8975 1727204054.37063: done checking for any_errors_fatal 8975 1727204054.37063: checking for max_fail_percentage 8975 1727204054.37067: done checking for max_fail_percentage 8975 1727204054.37068: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.37070: done checking to see if all hosts have failed 8975 1727204054.37071: getting the remaining hosts for this loop 8975 1727204054.37073: done getting the remaining hosts for this loop 8975 1727204054.37078: getting the next task for host managed-node2 8975 1727204054.37087: done getting next task for host managed-node2 8975 1727204054.37090: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8975 1727204054.37094: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.37099: getting variables 8975 1727204054.37101: in VariableManager get_vars() 8975 1727204054.37152: Calling all_inventory to load vars for managed-node2 8975 1727204054.37156: Calling groups_inventory to load vars for managed-node2 8975 1727204054.37158: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.37475: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.37480: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.37485: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.38374: done sending task result for task 127b8e07-fff9-9356-306d-000000000402 8975 1727204054.38378: WORKER PROCESS EXITING 8975 1727204054.40724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.45128: done with get_vars() 8975 1727204054.45375: done getting variables 8975 1727204054.45444: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204054.45674: variable 'profile' from source: include params 8975 1727204054.45679: variable 'item' from source: include params 8975 1727204054.45752: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.114) 0:00:25.774 ***** 8975 1727204054.45793: entering _queue_task() for managed-node2/command 8975 1727204054.46598: worker is 1 (out of 1 available) 8975 1727204054.46613: exiting _queue_task() for managed-node2/command 8975 1727204054.46629: done queuing things up, now waiting for results queue to drain 8975 1727204054.46630: waiting for pending results... 
8975 1727204054.47206: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 8975 1727204054.47473: in run() - task 127b8e07-fff9-9356-306d-000000000403 8975 1727204054.47595: variable 'ansible_search_path' from source: unknown 8975 1727204054.47606: variable 'ansible_search_path' from source: unknown 8975 1727204054.47656: calling self._execute() 8975 1727204054.47925: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.47942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.47959: variable 'omit' from source: magic vars 8975 1727204054.48837: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.48922: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.49395: variable 'profile_stat' from source: set_fact 8975 1727204054.49399: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204054.49403: when evaluation is False, skipping this task 8975 1727204054.49406: _execute() done 8975 1727204054.49408: dumping result to json 8975 1727204054.49410: done dumping result, returning 8975 1727204054.49413: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 [127b8e07-fff9-9356-306d-000000000403] 8975 1727204054.49418: sending task result for task 127b8e07-fff9-9356-306d-000000000403 8975 1727204054.49572: done sending task result for task 127b8e07-fff9-9356-306d-000000000403 8975 1727204054.49576: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204054.49652: no more pending results, returning what we have 8975 1727204054.49657: results queue empty 8975 1727204054.49658: checking for any_errors_fatal 8975 1727204054.49668: done checking for any_errors_fatal 8975 1727204054.49669: checking for max_fail_percentage 8975 1727204054.49671: done checking for max_fail_percentage 8975 1727204054.49672: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.49674: done checking to see if all hosts have failed 8975 1727204054.49675: getting the remaining hosts for this loop 8975 1727204054.49678: done getting the remaining hosts for this loop 8975 1727204054.49682: getting the next task for host managed-node2 8975 1727204054.49693: done getting next task for host managed-node2 8975 1727204054.49696: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8975 1727204054.49702: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.49708: getting variables 8975 1727204054.49710: in VariableManager get_vars() 8975 1727204054.49952: Calling all_inventory to load vars for managed-node2 8975 1727204054.49956: Calling groups_inventory to load vars for managed-node2 8975 1727204054.49958: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.49974: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.49977: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.49980: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.52039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.54977: done with get_vars() 8975 1727204054.55013: done getting variables 8975 1727204054.55086: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204054.55211: variable 'profile' from source: include params 8975 1727204054.55215: variable 'item' from source: include params 8975 1727204054.55281: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.095) 0:00:25.870 ***** 8975 1727204054.55313: entering _queue_task() for managed-node2/set_fact 8975 1727204054.55947: worker is 1 (out of 1 available) 8975 1727204054.55961: exiting _queue_task() for managed-node2/set_fact 8975 1727204054.56015: done queuing things up, now waiting for results queue to drain 8975 1727204054.56017: waiting for pending results... 
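The task skipped at get_profile_stat.yml:62 loads the 'command' action module and, judging by its name, greps the ifcfg file for the role's fingerprint comment. A rough sketch under that assumption (only the task name and the two guard conditions come from the log; the grep pattern, the ifcfg directory, and the register name are guesses):

    - name: Get the fingerprint comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep "^# System Role:" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # pattern and path assumed
      register: fingerprint_comment          # register name assumed
      when:
        - ansible_distribution_major_version != '6'
        - profile_stat.stat.exists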
8975 1727204054.56254: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 8975 1727204054.56447: in run() - task 127b8e07-fff9-9356-306d-000000000404 8975 1727204054.56473: variable 'ansible_search_path' from source: unknown 8975 1727204054.56476: variable 'ansible_search_path' from source: unknown 8975 1727204054.56555: calling self._execute() 8975 1727204054.56676: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.56680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.56683: variable 'omit' from source: magic vars 8975 1727204054.57316: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.57351: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.57504: variable 'profile_stat' from source: set_fact 8975 1727204054.57526: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204054.57546: when evaluation is False, skipping this task 8975 1727204054.57549: _execute() done 8975 1727204054.57656: dumping result to json 8975 1727204054.57659: done dumping result, returning 8975 1727204054.57662: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [127b8e07-fff9-9356-306d-000000000404] 8975 1727204054.57666: sending task result for task 127b8e07-fff9-9356-306d-000000000404 8975 1727204054.57746: done sending task result for task 127b8e07-fff9-9356-306d-000000000404 8975 1727204054.57750: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204054.57811: no more pending results, returning what we have 8975 1727204054.57816: results queue empty 8975 1727204054.57817: checking for any_errors_fatal 8975 1727204054.57824: done checking for any_errors_fatal 8975 1727204054.57825: checking for max_fail_percentage 8975 1727204054.57827: done checking for max_fail_percentage 8975 1727204054.57828: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.57829: done checking to see if all hosts have failed 8975 1727204054.57830: getting the remaining hosts for this loop 8975 1727204054.57832: done getting the remaining hosts for this loop 8975 1727204054.57837: getting the next task for host managed-node2 8975 1727204054.57848: done getting next task for host managed-node2 8975 1727204054.57850: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8975 1727204054.57854: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.57859: getting variables 8975 1727204054.57861: in VariableManager get_vars() 8975 1727204054.57913: Calling all_inventory to load vars for managed-node2 8975 1727204054.57917: Calling groups_inventory to load vars for managed-node2 8975 1727204054.57919: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.57935: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.57938: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.57942: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.60389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.64709: done with get_vars() 8975 1727204054.64752: done getting variables 8975 1727204054.64828: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204054.64959: variable 'profile' from source: include params 8975 1727204054.64964: variable 'item' from source: include params 8975 1727204054.65029: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.097) 0:00:25.967 ***** 8975 1727204054.65062: entering _queue_task() for managed-node2/assert 8975 1727204054.65436: worker is 1 (out of 1 available) 8975 1727204054.65453: exiting _queue_task() for managed-node2/assert 8975 1727204054.65470: done queuing things up, now waiting for results queue to drain 8975 1727204054.65472: waiting for pending results... 
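The companion set_fact at get_profile_stat.yml:69, skipped for the same reason, is presumably what would flip lsr_net_profile_fingerprint to true when the grep above succeeds. A sketch under that assumption (the fact name comes from the log; the rc test and register name are assumed):

    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      ansible.builtin.set_fact:
        lsr_net_profile_fingerprint: "{{ fingerprint_comment.rc == 0 }}"   # rc test assumed
      when:
        - ansible_distribution_major_version != '6'
        - profile_stat.stat.exists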
8975 1727204054.65854: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.0' 8975 1727204054.65946: in run() - task 127b8e07-fff9-9356-306d-000000000268 8975 1727204054.66004: variable 'ansible_search_path' from source: unknown 8975 1727204054.66008: variable 'ansible_search_path' from source: unknown 8975 1727204054.66042: calling self._execute() 8975 1727204054.66234: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.66238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.66241: variable 'omit' from source: magic vars 8975 1727204054.66883: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.66888: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.66892: variable 'omit' from source: magic vars 8975 1727204054.66958: variable 'omit' from source: magic vars 8975 1727204054.67375: variable 'profile' from source: include params 8975 1727204054.67379: variable 'item' from source: include params 8975 1727204054.67480: variable 'item' from source: include params 8975 1727204054.67509: variable 'omit' from source: magic vars 8975 1727204054.67706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204054.67814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204054.67818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204054.67835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204054.67888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204054.68056: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204054.68064: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.68075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.68474: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204054.68477: Set connection var ansible_connection to ssh 8975 1727204054.68481: Set connection var ansible_shell_executable to /bin/sh 8975 1727204054.68585: Set connection var ansible_timeout to 10 8975 1727204054.68588: Set connection var ansible_shell_type to sh 8975 1727204054.68592: Set connection var ansible_pipelining to False 8975 1727204054.68594: variable 'ansible_shell_executable' from source: unknown 8975 1727204054.68596: variable 'ansible_connection' from source: unknown 8975 1727204054.68599: variable 'ansible_module_compression' from source: unknown 8975 1727204054.68601: variable 'ansible_shell_type' from source: unknown 8975 1727204054.68603: variable 'ansible_shell_executable' from source: unknown 8975 1727204054.68605: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.68607: variable 'ansible_pipelining' from source: unknown 8975 1727204054.68610: variable 'ansible_timeout' from source: unknown 8975 1727204054.68612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.68717: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204054.68742: variable 'omit' from source: magic vars 8975 1727204054.68745: starting attempt loop 8975 1727204054.68748: running the handler 8975 1727204054.68925: variable 'lsr_net_profile_exists' from source: set_fact 8975 1727204054.68932: Evaluated conditional (lsr_net_profile_exists): True 8975 1727204054.68934: handler run complete 8975 1727204054.68936: attempt loop complete, returning result 8975 1727204054.68938: _execute() done 8975 1727204054.68940: dumping result to json 8975 1727204054.68947: done dumping result, returning 8975 1727204054.68950: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.0' [127b8e07-fff9-9356-306d-000000000268] 8975 1727204054.68952: sending task result for task 127b8e07-fff9-9356-306d-000000000268 8975 1727204054.69108: done sending task result for task 127b8e07-fff9-9356-306d-000000000268 8975 1727204054.69111: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204054.69170: no more pending results, returning what we have 8975 1727204054.69175: results queue empty 8975 1727204054.69176: checking for any_errors_fatal 8975 1727204054.69184: done checking for any_errors_fatal 8975 1727204054.69185: checking for max_fail_percentage 8975 1727204054.69190: done checking for max_fail_percentage 8975 1727204054.69191: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.69192: done checking to see if all hosts have failed 8975 1727204054.69193: getting the remaining hosts for this loop 8975 1727204054.69196: done getting the remaining hosts for this loop 8975 1727204054.69200: getting the next task for host managed-node2 8975 1727204054.69208: done getting next task for host managed-node2 8975 1727204054.69211: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8975 1727204054.69214: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.69220: getting variables 8975 1727204054.69223: in VariableManager get_vars() 8975 1727204054.69476: Calling all_inventory to load vars for managed-node2 8975 1727204054.69480: Calling groups_inventory to load vars for managed-node2 8975 1727204054.69482: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.69496: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.69500: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.69504: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.71336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.73351: done with get_vars() 8975 1727204054.73397: done getting variables 8975 1727204054.73472: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204054.73615: variable 'profile' from source: include params 8975 1727204054.73620: variable 'item' from source: include params 8975 1727204054.73690: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.086) 0:00:26.054 ***** 8975 1727204054.73730: entering _queue_task() for managed-node2/assert 8975 1727204054.74121: worker is 1 (out of 1 available) 8975 1727204054.74136: exiting _queue_task() for managed-node2/assert 8975 1727204054.74151: done queuing things up, now waiting for results queue to drain 8975 1727204054.74153: waiting for pending results... 
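The assert at assert_profile_present.yml:5 that just passed only has to check the flag computed by get_profile_stat.yml. A plausible shape for it (the evaluated condition lsr_net_profile_exists is taken from the log; the that-list layout and the failure message are assumptions):

    - name: "Assert that the profile is present - '{{ profile }}'"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_exists
        fail_msg: "profile '{{ profile }}' was not found"    # message assumed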
8975 1727204054.74592: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' 8975 1727204054.74599: in run() - task 127b8e07-fff9-9356-306d-000000000269 8975 1727204054.74603: variable 'ansible_search_path' from source: unknown 8975 1727204054.74606: variable 'ansible_search_path' from source: unknown 8975 1727204054.74641: calling self._execute() 8975 1727204054.74743: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.74835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.74839: variable 'omit' from source: magic vars 8975 1727204054.75162: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.75172: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.75180: variable 'omit' from source: magic vars 8975 1727204054.75234: variable 'omit' from source: magic vars 8975 1727204054.75348: variable 'profile' from source: include params 8975 1727204054.75352: variable 'item' from source: include params 8975 1727204054.75422: variable 'item' from source: include params 8975 1727204054.75449: variable 'omit' from source: magic vars 8975 1727204054.75496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204054.75534: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204054.75560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204054.75600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204054.75603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204054.75626: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204054.75632: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.75634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.75814: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204054.75818: Set connection var ansible_connection to ssh 8975 1727204054.75820: Set connection var ansible_shell_executable to /bin/sh 8975 1727204054.75823: Set connection var ansible_timeout to 10 8975 1727204054.75825: Set connection var ansible_shell_type to sh 8975 1727204054.75831: Set connection var ansible_pipelining to False 8975 1727204054.75833: variable 'ansible_shell_executable' from source: unknown 8975 1727204054.75835: variable 'ansible_connection' from source: unknown 8975 1727204054.75838: variable 'ansible_module_compression' from source: unknown 8975 1727204054.75840: variable 'ansible_shell_type' from source: unknown 8975 1727204054.75842: variable 'ansible_shell_executable' from source: unknown 8975 1727204054.75844: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.75846: variable 'ansible_pipelining' from source: unknown 8975 1727204054.75848: variable 'ansible_timeout' from source: unknown 8975 1727204054.75850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.76072: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204054.76076: variable 'omit' from source: magic vars 8975 1727204054.76078: starting attempt loop 8975 1727204054.76082: running the handler 8975 1727204054.76125: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8975 1727204054.76132: Evaluated conditional (lsr_net_profile_ansible_managed): True 8975 1727204054.76138: handler run complete 8975 1727204054.76154: attempt loop complete, returning result 8975 1727204054.76157: _execute() done 8975 1727204054.76160: dumping result to json 8975 1727204054.76163: done dumping result, returning 8975 1727204054.76197: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' [127b8e07-fff9-9356-306d-000000000269] 8975 1727204054.76200: sending task result for task 127b8e07-fff9-9356-306d-000000000269 8975 1727204054.76389: done sending task result for task 127b8e07-fff9-9356-306d-000000000269 8975 1727204054.76393: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204054.76447: no more pending results, returning what we have 8975 1727204054.76450: results queue empty 8975 1727204054.76451: checking for any_errors_fatal 8975 1727204054.76458: done checking for any_errors_fatal 8975 1727204054.76458: checking for max_fail_percentage 8975 1727204054.76460: done checking for max_fail_percentage 8975 1727204054.76461: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.76462: done checking to see if all hosts have failed 8975 1727204054.76463: getting the remaining hosts for this loop 8975 1727204054.76466: done getting the remaining hosts for this loop 8975 1727204054.76471: getting the next task for host managed-node2 8975 1727204054.76477: done getting next task for host managed-node2 8975 1727204054.76480: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8975 1727204054.76483: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.76487: getting variables 8975 1727204054.76488: in VariableManager get_vars() 8975 1727204054.76534: Calling all_inventory to load vars for managed-node2 8975 1727204054.76537: Calling groups_inventory to load vars for managed-node2 8975 1727204054.76540: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.76552: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.76557: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.76560: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.78518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.80785: done with get_vars() 8975 1727204054.80830: done getting variables 8975 1727204054.80906: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204054.81038: variable 'profile' from source: include params 8975 1727204054.81042: variable 'item' from source: include params 8975 1727204054.81112: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.074) 0:00:26.128 ***** 8975 1727204054.81156: entering _queue_task() for managed-node2/assert 8975 1727204054.81558: worker is 1 (out of 1 available) 8975 1727204054.81780: exiting _queue_task() for managed-node2/assert 8975 1727204054.81794: done queuing things up, now waiting for results queue to drain 8975 1727204054.81795: waiting for pending results... 
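Throughout these asserts both 'profile' and 'item' resolve from include params, which indicates assert_profile_present.yml is included once per profile by an outer looping task. A sketch of that calling pattern (the outer task name and the loop contents are assumptions; bond0.0 is the only item visible in this part of the log):

    - name: Assert profile present for each test profile     # outer task name assumed
      ansible.builtin.include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ item }}"
      loop:
        - bond0.0                                             # other loop items not shown here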
8975 1727204054.81931: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.0 8975 1727204054.82103: in run() - task 127b8e07-fff9-9356-306d-00000000026a 8975 1727204054.82107: variable 'ansible_search_path' from source: unknown 8975 1727204054.82110: variable 'ansible_search_path' from source: unknown 8975 1727204054.82155: calling self._execute() 8975 1727204054.82319: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.82323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.82326: variable 'omit' from source: magic vars 8975 1727204054.82733: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.82757: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.82770: variable 'omit' from source: magic vars 8975 1727204054.82823: variable 'omit' from source: magic vars 8975 1727204054.82973: variable 'profile' from source: include params 8975 1727204054.82978: variable 'item' from source: include params 8975 1727204054.83048: variable 'item' from source: include params 8975 1727204054.83106: variable 'omit' from source: magic vars 8975 1727204054.83138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204054.83189: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204054.83222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204054.83300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204054.83303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204054.83307: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204054.83316: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.83332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.83455: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204054.83463: Set connection var ansible_connection to ssh 8975 1727204054.83475: Set connection var ansible_shell_executable to /bin/sh 8975 1727204054.83485: Set connection var ansible_timeout to 10 8975 1727204054.83492: Set connection var ansible_shell_type to sh 8975 1727204054.83508: Set connection var ansible_pipelining to False 8975 1727204054.83547: variable 'ansible_shell_executable' from source: unknown 8975 1727204054.83625: variable 'ansible_connection' from source: unknown 8975 1727204054.83631: variable 'ansible_module_compression' from source: unknown 8975 1727204054.83634: variable 'ansible_shell_type' from source: unknown 8975 1727204054.83636: variable 'ansible_shell_executable' from source: unknown 8975 1727204054.83638: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.83640: variable 'ansible_pipelining' from source: unknown 8975 1727204054.83642: variable 'ansible_timeout' from source: unknown 8975 1727204054.83645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.83771: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204054.83845: variable 'omit' from source: magic vars 8975 1727204054.83848: starting attempt loop 8975 1727204054.83851: running the handler 8975 1727204054.83945: variable 'lsr_net_profile_fingerprint' from source: set_fact 8975 1727204054.83961: Evaluated conditional (lsr_net_profile_fingerprint): True 8975 1727204054.83976: handler run complete 8975 1727204054.83994: attempt loop complete, returning result 8975 1727204054.84001: _execute() done 8975 1727204054.84007: dumping result to json 8975 1727204054.84016: done dumping result, returning 8975 1727204054.84030: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.0 [127b8e07-fff9-9356-306d-00000000026a] 8975 1727204054.84041: sending task result for task 127b8e07-fff9-9356-306d-00000000026a ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204054.84321: no more pending results, returning what we have 8975 1727204054.84324: results queue empty 8975 1727204054.84325: checking for any_errors_fatal 8975 1727204054.84335: done checking for any_errors_fatal 8975 1727204054.84336: checking for max_fail_percentage 8975 1727204054.84338: done checking for max_fail_percentage 8975 1727204054.84339: checking to see if all hosts have failed and the running result is not ok 8975 1727204054.84340: done checking to see if all hosts have failed 8975 1727204054.84341: getting the remaining hosts for this loop 8975 1727204054.84343: done getting the remaining hosts for this loop 8975 1727204054.84347: getting the next task for host managed-node2 8975 1727204054.84358: done getting next task for host managed-node2 8975 1727204054.84362: ^ task is: TASK: Include the task 'get_profile_stat.yml' 8975 1727204054.84364: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204054.84371: getting variables 8975 1727204054.84372: in VariableManager get_vars() 8975 1727204054.84417: Calling all_inventory to load vars for managed-node2 8975 1727204054.84421: Calling groups_inventory to load vars for managed-node2 8975 1727204054.84423: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.84437: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.84440: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.84443: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.85085: done sending task result for task 127b8e07-fff9-9356-306d-00000000026a 8975 1727204054.85089: WORKER PROCESS EXITING 8975 1727204054.86705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204054.94915: done with get_vars() 8975 1727204054.94957: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.138) 0:00:26.267 ***** 8975 1727204054.95042: entering _queue_task() for managed-node2/include_tasks 8975 1727204054.95517: worker is 1 (out of 1 available) 8975 1727204054.95533: exiting _queue_task() for managed-node2/include_tasks 8975 1727204054.95546: done queuing things up, now waiting for results queue to drain 8975 1727204054.95548: waiting for pending results... 8975 1727204054.95806: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 8975 1727204054.95967: in run() - task 127b8e07-fff9-9356-306d-00000000026e 8975 1727204054.95996: variable 'ansible_search_path' from source: unknown 8975 1727204054.96006: variable 'ansible_search_path' from source: unknown 8975 1727204054.96056: calling self._execute() 8975 1727204054.96174: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204054.96188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204054.96206: variable 'omit' from source: magic vars 8975 1727204054.96671: variable 'ansible_distribution_major_version' from source: facts 8975 1727204054.96690: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204054.96763: _execute() done 8975 1727204054.96768: dumping result to json 8975 1727204054.96771: done dumping result, returning 8975 1727204054.96774: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-9356-306d-00000000026e] 8975 1727204054.96777: sending task result for task 127b8e07-fff9-9356-306d-00000000026e 8975 1727204054.96899: no more pending results, returning what we have 8975 1727204054.96905: in VariableManager get_vars() 8975 1727204054.96963: Calling all_inventory to load vars for managed-node2 8975 1727204054.96968: Calling groups_inventory to load vars for managed-node2 8975 1727204054.96971: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204054.96989: Calling all_plugins_play to load vars for managed-node2 8975 1727204054.96992: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204054.96996: Calling groups_plugins_play to load vars for managed-node2 8975 1727204054.97794: done sending task result for task 127b8e07-fff9-9356-306d-00000000026e 8975 1727204054.97798: WORKER PROCESS EXITING 8975 
1727204054.99142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204055.01390: done with get_vars() 8975 1727204055.01431: variable 'ansible_search_path' from source: unknown 8975 1727204055.01433: variable 'ansible_search_path' from source: unknown 8975 1727204055.01480: we have included files to process 8975 1727204055.01482: generating all_blocks data 8975 1727204055.01484: done generating all_blocks data 8975 1727204055.01491: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204055.01492: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204055.01494: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 8975 1727204055.02555: done processing included file 8975 1727204055.02558: iterating over new_blocks loaded from include file 8975 1727204055.02560: in VariableManager get_vars() 8975 1727204055.02586: done with get_vars() 8975 1727204055.02588: filtering new block on tags 8975 1727204055.02622: done filtering new block on tags 8975 1727204055.02626: in VariableManager get_vars() 8975 1727204055.02650: done with get_vars() 8975 1727204055.02651: filtering new block on tags 8975 1727204055.02676: done filtering new block on tags 8975 1727204055.02678: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 8975 1727204055.02684: extending task lists for all hosts with included blocks 8975 1727204055.02894: done extending task lists 8975 1727204055.02896: done processing included files 8975 1727204055.02897: results queue empty 8975 1727204055.02898: checking for any_errors_fatal 8975 1727204055.02902: done checking for any_errors_fatal 8975 1727204055.02903: checking for max_fail_percentage 8975 1727204055.02904: done checking for max_fail_percentage 8975 1727204055.02905: checking to see if all hosts have failed and the running result is not ok 8975 1727204055.02906: done checking to see if all hosts have failed 8975 1727204055.02907: getting the remaining hosts for this loop 8975 1727204055.02908: done getting the remaining hosts for this loop 8975 1727204055.02911: getting the next task for host managed-node2 8975 1727204055.02915: done getting next task for host managed-node2 8975 1727204055.02917: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 8975 1727204055.02920: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 8975 1727204055.02922: getting variables 8975 1727204055.02923: in VariableManager get_vars() 8975 1727204055.02947: Calling all_inventory to load vars for managed-node2 8975 1727204055.02950: Calling groups_inventory to load vars for managed-node2 8975 1727204055.02952: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204055.02959: Calling all_plugins_play to load vars for managed-node2 8975 1727204055.02961: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204055.02964: Calling groups_plugins_play to load vars for managed-node2 8975 1727204055.04643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204055.06946: done with get_vars() 8975 1727204055.06990: done getting variables 8975 1727204055.07047: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.120) 0:00:26.387 ***** 8975 1727204055.07092: entering _queue_task() for managed-node2/set_fact 8975 1727204055.07502: worker is 1 (out of 1 available) 8975 1727204055.07516: exiting _queue_task() for managed-node2/set_fact 8975 1727204055.07533: done queuing things up, now waiting for results queue to drain 8975 1727204055.07534: waiting for pending results... 
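The include just pulled get_profile_stat.yml back in, and its first task (get_profile_stat.yml:3) is the set_fact that resets the three lsr_net_profile_* flags to false, exactly as its result a few lines below confirms; the next task (get_profile_stat.yml:9) stats the profile file and registers profile_stat for the later conditionals. A sketch of that opening pair (task names, fact names, and values are from the log; the ifcfg path is an assumption):

    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

    - name: Stat profile file
      ansible.builtin.stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path assumed for illustration
      register: profile_stat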
8975 1727204055.07846: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 8975 1727204055.07994: in run() - task 127b8e07-fff9-9356-306d-000000000443 8975 1727204055.08017: variable 'ansible_search_path' from source: unknown 8975 1727204055.08026: variable 'ansible_search_path' from source: unknown 8975 1727204055.08081: calling self._execute() 8975 1727204055.08211: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.08225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.08292: variable 'omit' from source: magic vars 8975 1727204055.08711: variable 'ansible_distribution_major_version' from source: facts 8975 1727204055.08740: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204055.08758: variable 'omit' from source: magic vars 8975 1727204055.08822: variable 'omit' from source: magic vars 8975 1727204055.08883: variable 'omit' from source: magic vars 8975 1727204055.08942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204055.08999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204055.09030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204055.09085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204055.09088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204055.09116: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204055.09126: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.09138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.09302: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204055.09306: Set connection var ansible_connection to ssh 8975 1727204055.09308: Set connection var ansible_shell_executable to /bin/sh 8975 1727204055.09312: Set connection var ansible_timeout to 10 8975 1727204055.09314: Set connection var ansible_shell_type to sh 8975 1727204055.09322: Set connection var ansible_pipelining to False 8975 1727204055.09355: variable 'ansible_shell_executable' from source: unknown 8975 1727204055.09363: variable 'ansible_connection' from source: unknown 8975 1727204055.09375: variable 'ansible_module_compression' from source: unknown 8975 1727204055.09412: variable 'ansible_shell_type' from source: unknown 8975 1727204055.09415: variable 'ansible_shell_executable' from source: unknown 8975 1727204055.09417: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.09420: variable 'ansible_pipelining' from source: unknown 8975 1727204055.09423: variable 'ansible_timeout' from source: unknown 8975 1727204055.09425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.09593: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204055.09632: variable 'omit' from source: magic vars 8975 
1727204055.09635: starting attempt loop 8975 1727204055.09638: running the handler 8975 1727204055.09671: handler run complete 8975 1727204055.09675: attempt loop complete, returning result 8975 1727204055.09771: _execute() done 8975 1727204055.09774: dumping result to json 8975 1727204055.09777: done dumping result, returning 8975 1727204055.09779: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-9356-306d-000000000443] 8975 1727204055.09782: sending task result for task 127b8e07-fff9-9356-306d-000000000443 8975 1727204055.09863: done sending task result for task 127b8e07-fff9-9356-306d-000000000443 8975 1727204055.09868: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 8975 1727204055.09935: no more pending results, returning what we have 8975 1727204055.09939: results queue empty 8975 1727204055.09940: checking for any_errors_fatal 8975 1727204055.09942: done checking for any_errors_fatal 8975 1727204055.09943: checking for max_fail_percentage 8975 1727204055.09945: done checking for max_fail_percentage 8975 1727204055.09946: checking to see if all hosts have failed and the running result is not ok 8975 1727204055.09947: done checking to see if all hosts have failed 8975 1727204055.09948: getting the remaining hosts for this loop 8975 1727204055.09950: done getting the remaining hosts for this loop 8975 1727204055.09954: getting the next task for host managed-node2 8975 1727204055.09964: done getting next task for host managed-node2 8975 1727204055.09970: ^ task is: TASK: Stat profile file 8975 1727204055.09975: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204055.09980: getting variables 8975 1727204055.09982: in VariableManager get_vars() 8975 1727204055.10034: Calling all_inventory to load vars for managed-node2 8975 1727204055.10038: Calling groups_inventory to load vars for managed-node2 8975 1727204055.10040: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204055.10054: Calling all_plugins_play to load vars for managed-node2 8975 1727204055.10058: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204055.10061: Calling groups_plugins_play to load vars for managed-node2 8975 1727204055.12353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204055.14716: done with get_vars() 8975 1727204055.14762: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.077) 0:00:26.465 ***** 8975 1727204055.14882: entering _queue_task() for managed-node2/stat 8975 1727204055.15483: worker is 1 (out of 1 available) 8975 1727204055.15494: exiting _queue_task() for managed-node2/stat 8975 1727204055.15507: done queuing things up, now waiting for results queue to drain 8975 1727204055.15509: waiting for pending results... 8975 1727204055.15860: running TaskExecutor() for managed-node2/TASK: Stat profile file 8975 1727204055.15868: in run() - task 127b8e07-fff9-9356-306d-000000000444 8975 1727204055.15872: variable 'ansible_search_path' from source: unknown 8975 1727204055.15875: variable 'ansible_search_path' from source: unknown 8975 1727204055.15878: calling self._execute() 8975 1727204055.15994: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.16008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.16025: variable 'omit' from source: magic vars 8975 1727204055.16471: variable 'ansible_distribution_major_version' from source: facts 8975 1727204055.16490: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204055.16511: variable 'omit' from source: magic vars 8975 1727204055.16574: variable 'omit' from source: magic vars 8975 1727204055.16694: variable 'profile' from source: include params 8975 1727204055.16704: variable 'item' from source: include params 8975 1727204055.16791: variable 'item' from source: include params 8975 1727204055.16934: variable 'omit' from source: magic vars 8975 1727204055.16940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204055.16944: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204055.16971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204055.16998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204055.17016: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204055.17069: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204055.17079: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.17087: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 8975 1727204055.17219: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204055.17232: Set connection var ansible_connection to ssh 8975 1727204055.17263: Set connection var ansible_shell_executable to /bin/sh 8975 1727204055.17269: Set connection var ansible_timeout to 10 8975 1727204055.17376: Set connection var ansible_shell_type to sh 8975 1727204055.17380: Set connection var ansible_pipelining to False 8975 1727204055.17383: variable 'ansible_shell_executable' from source: unknown 8975 1727204055.17385: variable 'ansible_connection' from source: unknown 8975 1727204055.17387: variable 'ansible_module_compression' from source: unknown 8975 1727204055.17389: variable 'ansible_shell_type' from source: unknown 8975 1727204055.17392: variable 'ansible_shell_executable' from source: unknown 8975 1727204055.17394: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.17396: variable 'ansible_pipelining' from source: unknown 8975 1727204055.17399: variable 'ansible_timeout' from source: unknown 8975 1727204055.17401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.17634: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204055.17654: variable 'omit' from source: magic vars 8975 1727204055.17664: starting attempt loop 8975 1727204055.17700: running the handler 8975 1727204055.17704: _low_level_execute_command(): starting 8975 1727204055.17706: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204055.18558: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204055.18583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204055.18604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.18687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.18745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.18772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.18912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.20677: stdout chunk (state=3): >>>/root <<< 8975 1727204055.20791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.20862: stderr chunk (state=3): >>><<< 8975 1727204055.20868: stdout chunk (state=3): >>><<< 8975 
1727204055.20881: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204055.20893: _low_level_execute_command(): starting 8975 1727204055.20906: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328 `" && echo ansible-tmp-1727204055.2088208-11345-140688018238328="` echo /root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328 `" ) && sleep 0' 8975 1727204055.21419: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.21424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8975 1727204055.21441: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.21486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.21490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.21591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.23582: stdout chunk (state=3): >>>ansible-tmp-1727204055.2088208-11345-140688018238328=/root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328 <<< 8975 1727204055.23688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.23752: stderr chunk (state=3): >>><<< 8975 1727204055.23756: stdout chunk (state=3): >>><<< 8975 1727204055.23774: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204055.2088208-11345-140688018238328=/root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204055.23817: variable 'ansible_module_compression' from source: unknown 8975 1727204055.23864: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8975 1727204055.23901: variable 'ansible_facts' from source: unknown 8975 1727204055.23963: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/AnsiballZ_stat.py 8975 1727204055.24081: Sending initial data 8975 1727204055.24085: Sent initial data (152 bytes) 8975 1727204055.24556: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204055.24590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204055.24594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 8975 1727204055.24597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.24599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.24602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.24656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.24659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.24734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.26334: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204055.26406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204055.26501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp8zwr2ph6 /root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/AnsiballZ_stat.py <<< 8975 1727204055.26505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/AnsiballZ_stat.py" <<< 8975 1727204055.26567: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp8zwr2ph6" to remote "/root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/AnsiballZ_stat.py" <<< 8975 1727204055.27532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.27575: stderr chunk (state=3): >>><<< 8975 1727204055.27585: stdout chunk (state=3): >>><<< 8975 1727204055.27617: done transferring module to remote 8975 1727204055.27649: _low_level_execute_command(): starting 8975 1727204055.27660: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/ /root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/AnsiballZ_stat.py && sleep 0' 8975 1727204055.28423: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204055.28534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.28561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204055.28583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.28607: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 8975 1727204055.28746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.30957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.30962: stdout chunk (state=3): >>><<< 8975 1727204055.30976: stderr chunk (state=3): >>><<< 8975 1727204055.30980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204055.30983: _low_level_execute_command(): starting 8975 1727204055.30986: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/AnsiballZ_stat.py && sleep 0' 8975 1727204055.31974: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.31979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.32284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.32472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.32592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.49063: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": 
"sha1"}}} <<< 8975 1727204055.50417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204055.50673: stderr chunk (state=3): >>><<< 8975 1727204055.50678: stdout chunk (state=3): >>><<< 8975 1727204055.50681: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
8975 1727204055.50684: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204055.50686: _low_level_execute_command(): starting 8975 1727204055.50689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204055.2088208-11345-140688018238328/ > /dev/null 2>&1 && sleep 0' 8975 1727204055.51278: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204055.51288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204055.51299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.51322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204055.51334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204055.51341: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204055.51358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.51365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204055.51376: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204055.51383: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8975 1727204055.51391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204055.51470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.51474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204055.51481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204055.51483: stderr chunk (state=3): >>>debug2: match found <<< 8975 1727204055.51486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.51496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204055.51509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.51536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.51643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.53588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.53655: stderr chunk (state=3): >>><<< 8975 1727204055.53658: stdout chunk (state=3): >>><<< 8975 1727204055.53772: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204055.53775: handler run complete 8975 1727204055.53779: attempt loop complete, returning result 8975 1727204055.53781: _execute() done 8975 1727204055.53782: dumping result to json 8975 1727204055.53784: done dumping result, returning 8975 1727204055.53786: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-9356-306d-000000000444] 8975 1727204055.53788: sending task result for task 127b8e07-fff9-9356-306d-000000000444 8975 1727204055.53859: done sending task result for task 127b8e07-fff9-9356-306d-000000000444 8975 1727204055.53863: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 8975 1727204055.53926: no more pending results, returning what we have 8975 1727204055.53930: results queue empty 8975 1727204055.53930: checking for any_errors_fatal 8975 1727204055.53936: done checking for any_errors_fatal 8975 1727204055.53937: checking for max_fail_percentage 8975 1727204055.53939: done checking for max_fail_percentage 8975 1727204055.53940: checking to see if all hosts have failed and the running result is not ok 8975 1727204055.53941: done checking to see if all hosts have failed 8975 1727204055.53941: getting the remaining hosts for this loop 8975 1727204055.53943: done getting the remaining hosts for this loop 8975 1727204055.53947: getting the next task for host managed-node2 8975 1727204055.53955: done getting next task for host managed-node2 8975 1727204055.53957: ^ task is: TASK: Set NM profile exist flag based on the profile files 8975 1727204055.53961: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204055.53967: getting variables 8975 1727204055.53969: in VariableManager get_vars() 8975 1727204055.54011: Calling all_inventory to load vars for managed-node2 8975 1727204055.54014: Calling groups_inventory to load vars for managed-node2 8975 1727204055.54016: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204055.54026: Calling all_plugins_play to load vars for managed-node2 8975 1727204055.54029: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204055.54032: Calling groups_plugins_play to load vars for managed-node2 8975 1727204055.55592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204055.57478: done with get_vars() 8975 1727204055.57506: done getting variables 8975 1727204055.57559: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.427) 0:00:26.892 ***** 8975 1727204055.57587: entering _queue_task() for managed-node2/set_fact 8975 1727204055.57879: worker is 1 (out of 1 available) 8975 1727204055.57894: exiting _queue_task() for managed-node2/set_fact 8975 1727204055.57909: done queuing things up, now waiting for results queue to drain 8975 1727204055.57910: waiting for pending results... 
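
The task queued above ("Set NM profile exist flag based on the profile files") is guarded by a conditional on the registered stat result, and the entries that follow show it evaluating to False and the task being skipped. Below is a hedged sketch of how a "when: profile_stat.stat.exists" style expression can be evaluated against registered facts with Jinja2 (the jinja version in use is reported at the top of the log); the variable names mirror the log, but the snippet is illustrative and not Ansible's internal conditional code.

    # Hedged sketch: evaluating a "when: profile_stat.stat.exists" conditional
    # against registered facts with Jinja2. Not Ansible's implementation.
    import jinja2

    facts = {"profile_stat": {"stat": {"exists": False}}}  # result registered above

    expr = jinja2.Environment().compile_expression("profile_stat.stat.exists")
    if bool(expr(**facts)):
        print("run the set_fact task")
    else:
        print("skip: Conditional result was False")  # matches the skip reason below
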
8975 1727204055.58105: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 8975 1727204055.58191: in run() - task 127b8e07-fff9-9356-306d-000000000445 8975 1727204055.58204: variable 'ansible_search_path' from source: unknown 8975 1727204055.58208: variable 'ansible_search_path' from source: unknown 8975 1727204055.58244: calling self._execute() 8975 1727204055.58328: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.58335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.58344: variable 'omit' from source: magic vars 8975 1727204055.58654: variable 'ansible_distribution_major_version' from source: facts 8975 1727204055.58666: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204055.58764: variable 'profile_stat' from source: set_fact 8975 1727204055.58778: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204055.58782: when evaluation is False, skipping this task 8975 1727204055.58786: _execute() done 8975 1727204055.58788: dumping result to json 8975 1727204055.58791: done dumping result, returning 8975 1727204055.58802: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-9356-306d-000000000445] 8975 1727204055.58806: sending task result for task 127b8e07-fff9-9356-306d-000000000445 8975 1727204055.58905: done sending task result for task 127b8e07-fff9-9356-306d-000000000445 8975 1727204055.58908: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204055.58962: no more pending results, returning what we have 8975 1727204055.58968: results queue empty 8975 1727204055.58969: checking for any_errors_fatal 8975 1727204055.58981: done checking for any_errors_fatal 8975 1727204055.58982: checking for max_fail_percentage 8975 1727204055.58984: done checking for max_fail_percentage 8975 1727204055.58985: checking to see if all hosts have failed and the running result is not ok 8975 1727204055.58986: done checking to see if all hosts have failed 8975 1727204055.58986: getting the remaining hosts for this loop 8975 1727204055.58988: done getting the remaining hosts for this loop 8975 1727204055.58992: getting the next task for host managed-node2 8975 1727204055.59002: done getting next task for host managed-node2 8975 1727204055.59005: ^ task is: TASK: Get NM profile info 8975 1727204055.59009: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204055.59014: getting variables 8975 1727204055.59016: in VariableManager get_vars() 8975 1727204055.59060: Calling all_inventory to load vars for managed-node2 8975 1727204055.59063: Calling groups_inventory to load vars for managed-node2 8975 1727204055.59072: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204055.59084: Calling all_plugins_play to load vars for managed-node2 8975 1727204055.59087: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204055.59089: Calling groups_plugins_play to load vars for managed-node2 8975 1727204055.60692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204055.61883: done with get_vars() 8975 1727204055.61910: done getting variables 8975 1727204055.61962: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.044) 0:00:26.936 ***** 8975 1727204055.61991: entering _queue_task() for managed-node2/shell 8975 1727204055.62280: worker is 1 (out of 1 available) 8975 1727204055.62295: exiting _queue_task() for managed-node2/shell 8975 1727204055.62307: done queuing things up, now waiting for results queue to drain 8975 1727204055.62308: waiting for pending results... 8975 1727204055.62506: running TaskExecutor() for managed-node2/TASK: Get NM profile info 8975 1727204055.62607: in run() - task 127b8e07-fff9-9356-306d-000000000446 8975 1727204055.62620: variable 'ansible_search_path' from source: unknown 8975 1727204055.62624: variable 'ansible_search_path' from source: unknown 8975 1727204055.62660: calling self._execute() 8975 1727204055.62757: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.62763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.62773: variable 'omit' from source: magic vars 8975 1727204055.63084: variable 'ansible_distribution_major_version' from source: facts 8975 1727204055.63096: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204055.63101: variable 'omit' from source: magic vars 8975 1727204055.63138: variable 'omit' from source: magic vars 8975 1727204055.63218: variable 'profile' from source: include params 8975 1727204055.63222: variable 'item' from source: include params 8975 1727204055.63270: variable 'item' from source: include params 8975 1727204055.63288: variable 'omit' from source: magic vars 8975 1727204055.63332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204055.63361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204055.63380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204055.63394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204055.63407: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204055.63434: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204055.63442: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.63445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.63522: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204055.63525: Set connection var ansible_connection to ssh 8975 1727204055.63535: Set connection var ansible_shell_executable to /bin/sh 8975 1727204055.63543: Set connection var ansible_timeout to 10 8975 1727204055.63546: Set connection var ansible_shell_type to sh 8975 1727204055.63559: Set connection var ansible_pipelining to False 8975 1727204055.63580: variable 'ansible_shell_executable' from source: unknown 8975 1727204055.63583: variable 'ansible_connection' from source: unknown 8975 1727204055.63585: variable 'ansible_module_compression' from source: unknown 8975 1727204055.63588: variable 'ansible_shell_type' from source: unknown 8975 1727204055.63590: variable 'ansible_shell_executable' from source: unknown 8975 1727204055.63592: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204055.63597: variable 'ansible_pipelining' from source: unknown 8975 1727204055.63600: variable 'ansible_timeout' from source: unknown 8975 1727204055.63604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204055.63722: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204055.63734: variable 'omit' from source: magic vars 8975 1727204055.63739: starting attempt loop 8975 1727204055.63744: running the handler 8975 1727204055.63754: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204055.63772: _low_level_execute_command(): starting 8975 1727204055.63779: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204055.64353: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.64358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.64361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204055.64364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 
8975 1727204055.64369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.64426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204055.64429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.64434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.64509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.66247: stdout chunk (state=3): >>>/root <<< 8975 1727204055.66358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.66420: stderr chunk (state=3): >>><<< 8975 1727204055.66427: stdout chunk (state=3): >>><<< 8975 1727204055.66450: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204055.66467: _low_level_execute_command(): starting 8975 1727204055.66471: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198 `" && echo ansible-tmp-1727204055.6644964-11367-398207533198="` echo /root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198 `" ) && sleep 0' 8975 1727204055.66971: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.66975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.66979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204055.66982: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.66985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.67041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204055.67047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.67050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.67115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.69114: stdout chunk (state=3): >>>ansible-tmp-1727204055.6644964-11367-398207533198=/root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198 <<< 8975 1727204055.69225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.69293: stderr chunk (state=3): >>><<< 8975 1727204055.69296: stdout chunk (state=3): >>><<< 8975 1727204055.69312: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204055.6644964-11367-398207533198=/root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204055.69342: variable 'ansible_module_compression' from source: unknown 8975 1727204055.69394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204055.69430: variable 'ansible_facts' from source: unknown 8975 1727204055.69494: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/AnsiballZ_command.py 8975 1727204055.69616: Sending initial data 8975 1727204055.69619: Sent initial data (152 bytes) 8975 1727204055.70120: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.70126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.70131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.70133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.70187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204055.70191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.70198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.70268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.71869: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204055.71936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204055.72006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpdhbtnw7j /root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/AnsiballZ_command.py <<< 8975 1727204055.72016: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/AnsiballZ_command.py" <<< 8975 1727204055.72080: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpdhbtnw7j" to remote "/root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/AnsiballZ_command.py" <<< 8975 1727204055.72084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/AnsiballZ_command.py" <<< 8975 1727204055.72744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.72822: stderr chunk (state=3): >>><<< 8975 1727204055.72826: stdout chunk (state=3): >>><<< 8975 1727204055.72846: done transferring module to remote 8975 1727204055.72862: _low_level_execute_command(): starting 8975 1727204055.72867: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/ /root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/AnsiballZ_command.py && sleep 0' 8975 1727204055.73338: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204055.73342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 
1727204055.73384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.73387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204055.73390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.73392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204055.73394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.73447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204055.73450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.73533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.75358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.75417: stderr chunk (state=3): >>><<< 8975 1727204055.75421: stdout chunk (state=3): >>><<< 8975 1727204055.75436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204055.75439: _low_level_execute_command(): starting 8975 1727204055.75445: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/AnsiballZ_command.py && sleep 0' 8975 1727204055.75951: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204055.75955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
8975 1727204055.75957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204055.75960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204055.76019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204055.76028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.76031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.76099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.94887: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:15.925003", "end": "2024-09-24 14:54:15.947402", "delta": "0:00:00.022399", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204055.96483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204055.96542: stderr chunk (state=3): >>><<< 8975 1727204055.96546: stdout chunk (state=3): >>><<< 8975 1727204055.96568: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:15.925003", "end": "2024-09-24 14:54:15.947402", "delta": "0:00:00.022399", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
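
The command module run above executes the shell pipeline nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc and gets back the bond0.1 profile path under /etc/NetworkManager/system-connections. Below is a hedged Python sketch of the same check, with the filtering done in Python instead of grep; it assumes nmcli is installed on the target and is not the code the task actually ships.

    # Hedged sketch of the pipeline run by the "Get NM profile info" task:
    #   nmcli -f NAME,FILENAME connection show | grep bond0.1 | grep /etc
    # Assumes nmcli is on PATH; filtering is done in Python rather than grep.
    import subprocess

    def nm_profile_files(profile):
        out = subprocess.run(
            ["nmcli", "-f", "NAME,FILENAME", "connection", "show"],
            capture_output=True, text=True, check=True,
        ).stdout
        return [line for line in out.splitlines()
                if profile in line and "/etc" in line]

    if __name__ == "__main__":
        # On this run the single match is the bond0.1.nmconnection file seen above.
        for line in nm_profile_files("bond0.1"):
            print(line)

The rc registered from this pipeline (0 when a match is found, non-zero otherwise) is what the following task's nm_profile_exists.rc == 0 conditional keys on.
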
8975 1727204055.96603: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204055.96613: _low_level_execute_command(): starting 8975 1727204055.96616: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204055.6644964-11367-398207533198/ > /dev/null 2>&1 && sleep 0' 8975 1727204055.97370: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204055.97375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204055.97447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204055.99336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204055.99400: stderr chunk (state=3): >>><<< 8975 1727204055.99404: stdout chunk (state=3): >>><<< 8975 1727204055.99419: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204055.99426: handler run complete 8975 1727204055.99447: Evaluated conditional (False): False 8975 1727204055.99458: attempt loop complete, returning result 8975 1727204055.99461: _execute() done 8975 1727204055.99464: dumping result to json 8975 1727204055.99472: done dumping result, returning 8975 1727204055.99480: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-9356-306d-000000000446] 8975 1727204055.99485: sending task result for task 127b8e07-fff9-9356-306d-000000000446 8975 1727204055.99596: done sending task result for task 127b8e07-fff9-9356-306d-000000000446 8975 1727204055.99600: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.022399", "end": "2024-09-24 14:54:15.947402", "rc": 0, "start": "2024-09-24 14:54:15.925003" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 8975 1727204055.99677: no more pending results, returning what we have 8975 1727204055.99681: results queue empty 8975 1727204055.99681: checking for any_errors_fatal 8975 1727204055.99688: done checking for any_errors_fatal 8975 1727204055.99688: checking for max_fail_percentage 8975 1727204055.99690: done checking for max_fail_percentage 8975 1727204055.99691: checking to see if all hosts have failed and the running result is not ok 8975 1727204055.99692: done checking to see if all hosts have failed 8975 1727204055.99693: getting the remaining hosts for this loop 8975 1727204055.99695: done getting the remaining hosts for this loop 8975 1727204055.99699: getting the next task for host managed-node2 8975 1727204055.99707: done getting next task for host managed-node2 8975 1727204055.99709: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8975 1727204055.99713: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204055.99717: getting variables 8975 1727204055.99718: in VariableManager get_vars() 8975 1727204055.99762: Calling all_inventory to load vars for managed-node2 8975 1727204055.99764: Calling groups_inventory to load vars for managed-node2 8975 1727204055.99774: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204055.99786: Calling all_plugins_play to load vars for managed-node2 8975 1727204055.99789: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204055.99791: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.01810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.03863: done with get_vars() 8975 1727204056.03902: done getting variables 8975 1727204056.03960: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.420) 0:00:27.356 ***** 8975 1727204056.03995: entering _queue_task() for managed-node2/set_fact 8975 1727204056.04601: worker is 1 (out of 1 available) 8975 1727204056.04610: exiting _queue_task() for managed-node2/set_fact 8975 1727204056.04622: done queuing things up, now waiting for results queue to drain 8975 1727204056.04623: waiting for pending results... 8975 1727204056.04869: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 8975 1727204056.04883: in run() - task 127b8e07-fff9-9356-306d-000000000447 8975 1727204056.04912: variable 'ansible_search_path' from source: unknown 8975 1727204056.04920: variable 'ansible_search_path' from source: unknown 8975 1727204056.04976: calling self._execute() 8975 1727204056.05091: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.05121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.05141: variable 'omit' from source: magic vars 8975 1727204056.05651: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.05871: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.05989: variable 'nm_profile_exists' from source: set_fact 8975 1727204056.06011: Evaluated conditional (nm_profile_exists.rc == 0): True 8975 1727204056.06272: variable 'omit' from source: magic vars 8975 1727204056.06276: variable 'omit' from source: magic vars 8975 1727204056.06279: variable 'omit' from source: magic vars 8975 1727204056.06311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204056.06350: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204056.06382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204056.06399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.06411: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.06446: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204056.06449: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.06452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.06573: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204056.06576: Set connection var ansible_connection to ssh 8975 1727204056.06584: Set connection var ansible_shell_executable to /bin/sh 8975 1727204056.06590: Set connection var ansible_timeout to 10 8975 1727204056.06594: Set connection var ansible_shell_type to sh 8975 1727204056.06606: Set connection var ansible_pipelining to False 8975 1727204056.06633: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.06636: variable 'ansible_connection' from source: unknown 8975 1727204056.06638: variable 'ansible_module_compression' from source: unknown 8975 1727204056.06641: variable 'ansible_shell_type' from source: unknown 8975 1727204056.06643: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.06648: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.06652: variable 'ansible_pipelining' from source: unknown 8975 1727204056.06655: variable 'ansible_timeout' from source: unknown 8975 1727204056.06661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.06930: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204056.06934: variable 'omit' from source: magic vars 8975 1727204056.06936: starting attempt loop 8975 1727204056.06938: running the handler 8975 1727204056.06940: handler run complete 8975 1727204056.06942: attempt loop complete, returning result 8975 1727204056.06943: _execute() done 8975 1727204056.06945: dumping result to json 8975 1727204056.06947: done dumping result, returning 8975 1727204056.06949: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-9356-306d-000000000447] 8975 1727204056.06951: sending task result for task 127b8e07-fff9-9356-306d-000000000447 8975 1727204056.07023: done sending task result for task 127b8e07-fff9-9356-306d-000000000447 8975 1727204056.07026: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 8975 1727204056.07087: no more pending results, returning what we have 8975 1727204056.07090: results queue empty 8975 1727204056.07091: checking for any_errors_fatal 8975 1727204056.07101: done checking for any_errors_fatal 8975 1727204056.07102: checking for max_fail_percentage 8975 1727204056.07103: done checking for max_fail_percentage 8975 1727204056.07104: checking to see if all hosts have failed and the running result is not ok 8975 1727204056.07106: done checking to see if all hosts have failed 8975 1727204056.07106: getting the remaining hosts for this loop 8975 1727204056.07108: done getting the remaining hosts for this loop 8975 1727204056.07113: getting 
the next task for host managed-node2 8975 1727204056.07123: done getting next task for host managed-node2 8975 1727204056.07125: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 8975 1727204056.07131: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204056.07135: getting variables 8975 1727204056.07137: in VariableManager get_vars() 8975 1727204056.07177: Calling all_inventory to load vars for managed-node2 8975 1727204056.07180: Calling groups_inventory to load vars for managed-node2 8975 1727204056.07182: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204056.07192: Calling all_plugins_play to load vars for managed-node2 8975 1727204056.07194: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204056.07197: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.09337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.11644: done with get_vars() 8975 1727204056.11696: done getting variables 8975 1727204056.11761: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204056.11890: variable 'profile' from source: include params 8975 1727204056.11895: variable 'item' from source: include params 8975 1727204056.12037: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.080) 0:00:27.437 ***** 8975 1727204056.12088: entering _queue_task() for managed-node2/command 8975 1727204056.12540: worker is 1 (out of 1 available) 8975 1727204056.12555: exiting _queue_task() for managed-node2/command 8975 1727204056.12572: done queuing things up, now waiting for results queue to drain 8975 1727204056.12573: waiting for pending results... 
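
Note: the two results above (the nmcli query returning rc=0 and the set_fact that follows it) reflect a check-then-flag pattern in get_profile_stat.yml. The sketch below is reconstructed only from the logged command string, the evaluated conditional (nm_profile_exists.rc == 0) and the facts reported in the result; task layout and the failed_when handling are assumptions, not the collection's verbatim source.

    # Hedged sketch, not the actual contents of get_profile_stat.yml.
    # "bond0.1" in the logged cmd is the already-templated value of {{ profile }}.
    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      failed_when: false   # assumption: a missing profile is tolerated here and only inspected below

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0
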
8975 1727204056.12951: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 8975 1727204056.13010: in run() - task 127b8e07-fff9-9356-306d-000000000449 8975 1727204056.13023: variable 'ansible_search_path' from source: unknown 8975 1727204056.13030: variable 'ansible_search_path' from source: unknown 8975 1727204056.13079: calling self._execute() 8975 1727204056.13194: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.13198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.13207: variable 'omit' from source: magic vars 8975 1727204056.13623: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.13636: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.13774: variable 'profile_stat' from source: set_fact 8975 1727204056.13789: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204056.13792: when evaluation is False, skipping this task 8975 1727204056.13795: _execute() done 8975 1727204056.13798: dumping result to json 8975 1727204056.13800: done dumping result, returning 8975 1727204056.13810: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [127b8e07-fff9-9356-306d-000000000449] 8975 1727204056.13816: sending task result for task 127b8e07-fff9-9356-306d-000000000449 8975 1727204056.13934: done sending task result for task 127b8e07-fff9-9356-306d-000000000449 8975 1727204056.13938: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204056.14011: no more pending results, returning what we have 8975 1727204056.14016: results queue empty 8975 1727204056.14017: checking for any_errors_fatal 8975 1727204056.14024: done checking for any_errors_fatal 8975 1727204056.14025: checking for max_fail_percentage 8975 1727204056.14027: done checking for max_fail_percentage 8975 1727204056.14028: checking to see if all hosts have failed and the running result is not ok 8975 1727204056.14030: done checking to see if all hosts have failed 8975 1727204056.14031: getting the remaining hosts for this loop 8975 1727204056.14033: done getting the remaining hosts for this loop 8975 1727204056.14268: getting the next task for host managed-node2 8975 1727204056.14277: done getting next task for host managed-node2 8975 1727204056.14280: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 8975 1727204056.14284: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204056.14289: getting variables 8975 1727204056.14291: in VariableManager get_vars() 8975 1727204056.14332: Calling all_inventory to load vars for managed-node2 8975 1727204056.14335: Calling groups_inventory to load vars for managed-node2 8975 1727204056.14337: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204056.14348: Calling all_plugins_play to load vars for managed-node2 8975 1727204056.14351: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204056.14354: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.16249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.18431: done with get_vars() 8975 1727204056.18470: done getting variables 8975 1727204056.18544: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204056.18676: variable 'profile' from source: include params 8975 1727204056.18680: variable 'item' from source: include params 8975 1727204056.18750: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.066) 0:00:27.504 ***** 8975 1727204056.18787: entering _queue_task() for managed-node2/set_fact 8975 1727204056.19212: worker is 1 (out of 1 available) 8975 1727204056.19227: exiting _queue_task() for managed-node2/set_fact 8975 1727204056.19241: done queuing things up, now waiting for results queue to drain 8975 1727204056.19243: waiting for pending results... 
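
Note: the ifcfg-related tasks in this stretch of the log are skipped because profile_stat.stat.exists evaluated to False; on this host the bond0.1 profile lives under /etc/NetworkManager/system-connections (per the nmcli output earlier), so no legacy ifcfg file exists. The log only shows that profile_stat carries a .stat.exists attribute; one plausible way such a variable gets populated is a stat of the initscripts path, sketched below (path and task name are illustrative assumptions).

    # Illustrative only: the task that actually fills profile_stat is not shown in this log.
    - name: Stat the ifcfg file for the profile
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # assumed legacy path
      register: profile_stat
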
8975 1727204056.19695: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 8975 1727204056.19701: in run() - task 127b8e07-fff9-9356-306d-00000000044a 8975 1727204056.19707: variable 'ansible_search_path' from source: unknown 8975 1727204056.19711: variable 'ansible_search_path' from source: unknown 8975 1727204056.19832: calling self._execute() 8975 1727204056.19867: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.19880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.19888: variable 'omit' from source: magic vars 8975 1727204056.20471: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.20475: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.20480: variable 'profile_stat' from source: set_fact 8975 1727204056.20485: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204056.20488: when evaluation is False, skipping this task 8975 1727204056.20491: _execute() done 8975 1727204056.20495: dumping result to json 8975 1727204056.20498: done dumping result, returning 8975 1727204056.20506: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [127b8e07-fff9-9356-306d-00000000044a] 8975 1727204056.20514: sending task result for task 127b8e07-fff9-9356-306d-00000000044a skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204056.20674: no more pending results, returning what we have 8975 1727204056.20773: results queue empty 8975 1727204056.20774: checking for any_errors_fatal 8975 1727204056.20781: done checking for any_errors_fatal 8975 1727204056.20782: checking for max_fail_percentage 8975 1727204056.20784: done checking for max_fail_percentage 8975 1727204056.20786: checking to see if all hosts have failed and the running result is not ok 8975 1727204056.20788: done checking to see if all hosts have failed 8975 1727204056.20789: getting the remaining hosts for this loop 8975 1727204056.20791: done getting the remaining hosts for this loop 8975 1727204056.20796: getting the next task for host managed-node2 8975 1727204056.20804: done getting next task for host managed-node2 8975 1727204056.20807: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 8975 1727204056.20813: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204056.20818: getting variables 8975 1727204056.20820: in VariableManager get_vars() 8975 1727204056.20981: Calling all_inventory to load vars for managed-node2 8975 1727204056.20985: Calling groups_inventory to load vars for managed-node2 8975 1727204056.20988: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204056.20995: done sending task result for task 127b8e07-fff9-9356-306d-00000000044a 8975 1727204056.21007: Calling all_plugins_play to load vars for managed-node2 8975 1727204056.21010: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204056.21014: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.21535: WORKER PROCESS EXITING 8975 1727204056.22912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.25117: done with get_vars() 8975 1727204056.25156: done getting variables 8975 1727204056.25222: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204056.25383: variable 'profile' from source: include params 8975 1727204056.25388: variable 'item' from source: include params 8975 1727204056.25456: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.067) 0:00:27.571 ***** 8975 1727204056.25494: entering _queue_task() for managed-node2/command 8975 1727204056.25905: worker is 1 (out of 1 available) 8975 1727204056.25922: exiting _queue_task() for managed-node2/command 8975 1727204056.25939: done queuing things up, now waiting for results queue to drain 8975 1727204056.25941: waiting for pending results... 
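
Note: each of the skipped "Get/Verify the ... comment in ifcfg-bond0.1" tasks follows the same guard pattern visible in the skipping results: run only when the ifcfg file exists. A minimal sketch of that pattern is shown below; the grep expression and register name are assumptions for illustration, since the log reveals only the task name and the false_condition.

    # Sketch of the conditional pattern behind the
    # "skipping: ... false_condition: profile_stat.stat.exists" results above.
    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command:
        cmd: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: ansible_managed_comment   # assumed register name
      when: profile_stat.stat.exists
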
8975 1727204056.26385: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 8975 1727204056.26439: in run() - task 127b8e07-fff9-9356-306d-00000000044b 8975 1727204056.26468: variable 'ansible_search_path' from source: unknown 8975 1727204056.26478: variable 'ansible_search_path' from source: unknown 8975 1727204056.26529: calling self._execute() 8975 1727204056.26646: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.26659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.26678: variable 'omit' from source: magic vars 8975 1727204056.27100: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.27116: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.27262: variable 'profile_stat' from source: set_fact 8975 1727204056.27265: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204056.27274: when evaluation is False, skipping this task 8975 1727204056.27280: _execute() done 8975 1727204056.27287: dumping result to json 8975 1727204056.27373: done dumping result, returning 8975 1727204056.27376: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 [127b8e07-fff9-9356-306d-00000000044b] 8975 1727204056.27379: sending task result for task 127b8e07-fff9-9356-306d-00000000044b 8975 1727204056.27457: done sending task result for task 127b8e07-fff9-9356-306d-00000000044b 8975 1727204056.27460: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204056.27537: no more pending results, returning what we have 8975 1727204056.27541: results queue empty 8975 1727204056.27542: checking for any_errors_fatal 8975 1727204056.27553: done checking for any_errors_fatal 8975 1727204056.27553: checking for max_fail_percentage 8975 1727204056.27555: done checking for max_fail_percentage 8975 1727204056.27556: checking to see if all hosts have failed and the running result is not ok 8975 1727204056.27558: done checking to see if all hosts have failed 8975 1727204056.27559: getting the remaining hosts for this loop 8975 1727204056.27561: done getting the remaining hosts for this loop 8975 1727204056.27567: getting the next task for host managed-node2 8975 1727204056.27577: done getting next task for host managed-node2 8975 1727204056.27580: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 8975 1727204056.27585: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204056.27591: getting variables 8975 1727204056.27593: in VariableManager get_vars() 8975 1727204056.27647: Calling all_inventory to load vars for managed-node2 8975 1727204056.27650: Calling groups_inventory to load vars for managed-node2 8975 1727204056.27652: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204056.27947: Calling all_plugins_play to load vars for managed-node2 8975 1727204056.27952: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204056.27957: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.29650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.31827: done with get_vars() 8975 1727204056.31872: done getting variables 8975 1727204056.31948: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204056.32085: variable 'profile' from source: include params 8975 1727204056.32090: variable 'item' from source: include params 8975 1727204056.32159: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.066) 0:00:27.638 ***** 8975 1727204056.32195: entering _queue_task() for managed-node2/set_fact 8975 1727204056.32603: worker is 1 (out of 1 available) 8975 1727204056.32617: exiting _queue_task() for managed-node2/set_fact 8975 1727204056.32634: done queuing things up, now waiting for results queue to drain 8975 1727204056.32636: waiting for pending results... 
8975 1727204056.33001: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 8975 1727204056.33274: in run() - task 127b8e07-fff9-9356-306d-00000000044c 8975 1727204056.33278: variable 'ansible_search_path' from source: unknown 8975 1727204056.33281: variable 'ansible_search_path' from source: unknown 8975 1727204056.33284: calling self._execute() 8975 1727204056.33339: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.33353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.33372: variable 'omit' from source: magic vars 8975 1727204056.33785: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.33803: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.33956: variable 'profile_stat' from source: set_fact 8975 1727204056.33978: Evaluated conditional (profile_stat.stat.exists): False 8975 1727204056.33985: when evaluation is False, skipping this task 8975 1727204056.33991: _execute() done 8975 1727204056.33997: dumping result to json 8975 1727204056.34003: done dumping result, returning 8975 1727204056.34014: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [127b8e07-fff9-9356-306d-00000000044c] 8975 1727204056.34053: sending task result for task 127b8e07-fff9-9356-306d-00000000044c skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 8975 1727204056.34411: no more pending results, returning what we have 8975 1727204056.34415: results queue empty 8975 1727204056.34416: checking for any_errors_fatal 8975 1727204056.34422: done checking for any_errors_fatal 8975 1727204056.34423: checking for max_fail_percentage 8975 1727204056.34425: done checking for max_fail_percentage 8975 1727204056.34426: checking to see if all hosts have failed and the running result is not ok 8975 1727204056.34429: done checking to see if all hosts have failed 8975 1727204056.34430: getting the remaining hosts for this loop 8975 1727204056.34432: done getting the remaining hosts for this loop 8975 1727204056.34435: getting the next task for host managed-node2 8975 1727204056.34445: done getting next task for host managed-node2 8975 1727204056.34448: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 8975 1727204056.34451: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204056.34455: getting variables 8975 1727204056.34456: in VariableManager get_vars() 8975 1727204056.34502: Calling all_inventory to load vars for managed-node2 8975 1727204056.34505: Calling groups_inventory to load vars for managed-node2 8975 1727204056.34507: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204056.34520: Calling all_plugins_play to load vars for managed-node2 8975 1727204056.34523: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204056.34526: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.35083: done sending task result for task 127b8e07-fff9-9356-306d-00000000044c 8975 1727204056.35086: WORKER PROCESS EXITING 8975 1727204056.36651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.38910: done with get_vars() 8975 1727204056.38954: done getting variables 8975 1727204056.39025: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204056.39343: variable 'profile' from source: include params 8975 1727204056.39347: variable 'item' from source: include params 8975 1727204056.39413: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.072) 0:00:27.711 ***** 8975 1727204056.39451: entering _queue_task() for managed-node2/assert 8975 1727204056.39850: worker is 1 (out of 1 available) 8975 1727204056.40074: exiting _queue_task() for managed-node2/assert 8975 1727204056.40086: done queuing things up, now waiting for results queue to drain 8975 1727204056.40088: waiting for pending results... 
8975 1727204056.40212: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.1' 8975 1727204056.40364: in run() - task 127b8e07-fff9-9356-306d-00000000026f 8975 1727204056.40390: variable 'ansible_search_path' from source: unknown 8975 1727204056.40400: variable 'ansible_search_path' from source: unknown 8975 1727204056.40456: calling self._execute() 8975 1727204056.40579: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.40593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.40611: variable 'omit' from source: magic vars 8975 1727204056.41033: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.41055: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.41071: variable 'omit' from source: magic vars 8975 1727204056.41126: variable 'omit' from source: magic vars 8975 1727204056.41432: variable 'profile' from source: include params 8975 1727204056.41436: variable 'item' from source: include params 8975 1727204056.41535: variable 'item' from source: include params 8975 1727204056.41652: variable 'omit' from source: magic vars 8975 1727204056.41690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204056.42071: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204056.42074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204056.42077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.42080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.42083: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204056.42086: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.42088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.42091: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204056.42093: Set connection var ansible_connection to ssh 8975 1727204056.42095: Set connection var ansible_shell_executable to /bin/sh 8975 1727204056.42098: Set connection var ansible_timeout to 10 8975 1727204056.42101: Set connection var ansible_shell_type to sh 8975 1727204056.42104: Set connection var ansible_pipelining to False 8975 1727204056.42106: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.42108: variable 'ansible_connection' from source: unknown 8975 1727204056.42110: variable 'ansible_module_compression' from source: unknown 8975 1727204056.42112: variable 'ansible_shell_type' from source: unknown 8975 1727204056.42114: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.42116: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.42118: variable 'ansible_pipelining' from source: unknown 8975 1727204056.42121: variable 'ansible_timeout' from source: unknown 8975 1727204056.42123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.42347: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204056.42351: variable 'omit' from source: magic vars 8975 1727204056.42354: starting attempt loop 8975 1727204056.42356: running the handler 8975 1727204056.42433: variable 'lsr_net_profile_exists' from source: set_fact 8975 1727204056.42436: Evaluated conditional (lsr_net_profile_exists): True 8975 1727204056.42442: handler run complete 8975 1727204056.42477: attempt loop complete, returning result 8975 1727204056.42480: _execute() done 8975 1727204056.42483: dumping result to json 8975 1727204056.42486: done dumping result, returning 8975 1727204056.42492: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.1' [127b8e07-fff9-9356-306d-00000000026f] 8975 1727204056.42504: sending task result for task 127b8e07-fff9-9356-306d-00000000026f 8975 1727204056.42751: done sending task result for task 127b8e07-fff9-9356-306d-00000000026f 8975 1727204056.42754: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204056.42823: no more pending results, returning what we have 8975 1727204056.42826: results queue empty 8975 1727204056.42830: checking for any_errors_fatal 8975 1727204056.42835: done checking for any_errors_fatal 8975 1727204056.42836: checking for max_fail_percentage 8975 1727204056.42838: done checking for max_fail_percentage 8975 1727204056.42839: checking to see if all hosts have failed and the running result is not ok 8975 1727204056.42840: done checking to see if all hosts have failed 8975 1727204056.42841: getting the remaining hosts for this loop 8975 1727204056.42843: done getting the remaining hosts for this loop 8975 1727204056.42847: getting the next task for host managed-node2 8975 1727204056.42854: done getting next task for host managed-node2 8975 1727204056.42857: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 8975 1727204056.42860: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204056.42863: getting variables 8975 1727204056.42865: in VariableManager get_vars() 8975 1727204056.42906: Calling all_inventory to load vars for managed-node2 8975 1727204056.42909: Calling groups_inventory to load vars for managed-node2 8975 1727204056.42911: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204056.42923: Calling all_plugins_play to load vars for managed-node2 8975 1727204056.42926: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204056.42932: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.46021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.48171: done with get_vars() 8975 1727204056.48210: done getting variables 8975 1727204056.48286: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204056.48413: variable 'profile' from source: include params 8975 1727204056.48418: variable 'item' from source: include params 8975 1727204056.48482: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.090) 0:00:27.802 ***** 8975 1727204056.48521: entering _queue_task() for managed-node2/assert 8975 1727204056.48917: worker is 1 (out of 1 available) 8975 1727204056.48933: exiting _queue_task() for managed-node2/assert 8975 1727204056.48947: done queuing things up, now waiting for results queue to drain 8975 1727204056.48949: waiting for pending results... 
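
Note: once the lsr_net_profile_* flags are in place, assert_profile_present.yml reduces each "All assertions passed" result in this part of the log to a one-line conditional. The conditionals (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint) and task names are taken from the log; the exact file layout and any msg fields are assumptions.

    # Compact sketch mirroring the conditionals evaluated in the log.
    - name: Assert that the profile is present - '{{ profile }}'
      ansible.builtin.assert:
        that: lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      ansible.builtin.assert:
        that: lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      ansible.builtin.assert:
        that: lsr_net_profile_fingerprint
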
8975 1727204056.49268: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' 8975 1727204056.49573: in run() - task 127b8e07-fff9-9356-306d-000000000270 8975 1727204056.49578: variable 'ansible_search_path' from source: unknown 8975 1727204056.49581: variable 'ansible_search_path' from source: unknown 8975 1727204056.49585: calling self._execute() 8975 1727204056.49625: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.49643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.49659: variable 'omit' from source: magic vars 8975 1727204056.50110: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.50134: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.50150: variable 'omit' from source: magic vars 8975 1727204056.50199: variable 'omit' from source: magic vars 8975 1727204056.50333: variable 'profile' from source: include params 8975 1727204056.50359: variable 'item' from source: include params 8975 1727204056.50468: variable 'item' from source: include params 8975 1727204056.50473: variable 'omit' from source: magic vars 8975 1727204056.50517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204056.50574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204056.50603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204056.50631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.50684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.50697: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204056.50706: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.50713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.50845: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204056.50900: Set connection var ansible_connection to ssh 8975 1727204056.50903: Set connection var ansible_shell_executable to /bin/sh 8975 1727204056.50906: Set connection var ansible_timeout to 10 8975 1727204056.50909: Set connection var ansible_shell_type to sh 8975 1727204056.50911: Set connection var ansible_pipelining to False 8975 1727204056.50932: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.50940: variable 'ansible_connection' from source: unknown 8975 1727204056.50947: variable 'ansible_module_compression' from source: unknown 8975 1727204056.50953: variable 'ansible_shell_type' from source: unknown 8975 1727204056.50960: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.50969: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.50978: variable 'ansible_pipelining' from source: unknown 8975 1727204056.51008: variable 'ansible_timeout' from source: unknown 8975 1727204056.51011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.51164: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204056.51186: variable 'omit' from source: magic vars 8975 1727204056.51225: starting attempt loop 8975 1727204056.51231: running the handler 8975 1727204056.51356: variable 'lsr_net_profile_ansible_managed' from source: set_fact 8975 1727204056.51370: Evaluated conditional (lsr_net_profile_ansible_managed): True 8975 1727204056.51444: handler run complete 8975 1727204056.51448: attempt loop complete, returning result 8975 1727204056.51450: _execute() done 8975 1727204056.51453: dumping result to json 8975 1727204056.51455: done dumping result, returning 8975 1727204056.51458: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' [127b8e07-fff9-9356-306d-000000000270] 8975 1727204056.51460: sending task result for task 127b8e07-fff9-9356-306d-000000000270 8975 1727204056.51681: done sending task result for task 127b8e07-fff9-9356-306d-000000000270 8975 1727204056.51685: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204056.51742: no more pending results, returning what we have 8975 1727204056.51746: results queue empty 8975 1727204056.51747: checking for any_errors_fatal 8975 1727204056.51756: done checking for any_errors_fatal 8975 1727204056.51757: checking for max_fail_percentage 8975 1727204056.51759: done checking for max_fail_percentage 8975 1727204056.51761: checking to see if all hosts have failed and the running result is not ok 8975 1727204056.51762: done checking to see if all hosts have failed 8975 1727204056.51763: getting the remaining hosts for this loop 8975 1727204056.51766: done getting the remaining hosts for this loop 8975 1727204056.51771: getting the next task for host managed-node2 8975 1727204056.51780: done getting next task for host managed-node2 8975 1727204056.51783: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 8975 1727204056.51787: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204056.51792: getting variables 8975 1727204056.51794: in VariableManager get_vars() 8975 1727204056.51846: Calling all_inventory to load vars for managed-node2 8975 1727204056.51849: Calling groups_inventory to load vars for managed-node2 8975 1727204056.51852: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204056.52068: Calling all_plugins_play to load vars for managed-node2 8975 1727204056.52073: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204056.52078: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.53832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.56033: done with get_vars() 8975 1727204056.56076: done getting variables 8975 1727204056.56149: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204056.56275: variable 'profile' from source: include params 8975 1727204056.56280: variable 'item' from source: include params 8975 1727204056.56344: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.078) 0:00:27.880 ***** 8975 1727204056.56387: entering _queue_task() for managed-node2/assert 8975 1727204056.56779: worker is 1 (out of 1 available) 8975 1727204056.56794: exiting _queue_task() for managed-node2/assert 8975 1727204056.56808: done queuing things up, now waiting for results queue to drain 8975 1727204056.56810: waiting for pending results... 
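
Note: the repeated "Set connection var ..." lines (ansible_connection=ssh, ansible_shell_type=sh, ansible_timeout=10, pipelining False, ZIP_DEFLATED module compression) and the "auto-mux: Trying existing master at '/root/.ansible/cp/...'" stderr chunks in the SSH exchanges below reflect Ansible's stock SSH connection behaviour: OpenSSH ControlMaster multiplexing with the control socket under ~/.ansible/cp. The group_vars sketch below makes those settings explicit; the values mirror common defaults and are not read from this run's configuration.

    # Illustrative group_vars sketch; the test environment's actual config is not part of this log.
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false
    # ansible_module_compression defaults to ZIP_DEFLATED, matching the log above.
    # The "auto-mux" lines come from ControlMaster/ControlPersist options Ansible
    # passes by default; they can be pinned explicitly like this:
    ansible_ssh_args: "-C -o ControlMaster=auto -o ControlPersist=60s"
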
8975 1727204056.57121: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.1 8975 1727204056.57262: in run() - task 127b8e07-fff9-9356-306d-000000000271 8975 1727204056.57471: variable 'ansible_search_path' from source: unknown 8975 1727204056.57476: variable 'ansible_search_path' from source: unknown 8975 1727204056.57479: calling self._execute() 8975 1727204056.57483: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.57486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.57500: variable 'omit' from source: magic vars 8975 1727204056.57930: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.57953: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.58047: variable 'omit' from source: magic vars 8975 1727204056.58051: variable 'omit' from source: magic vars 8975 1727204056.58144: variable 'profile' from source: include params 8975 1727204056.58159: variable 'item' from source: include params 8975 1727204056.58234: variable 'item' from source: include params 8975 1727204056.58267: variable 'omit' from source: magic vars 8975 1727204056.58319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204056.58369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204056.58402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204056.58431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.58451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.58497: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204056.58506: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.58514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.58664: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204056.58671: Set connection var ansible_connection to ssh 8975 1727204056.58674: Set connection var ansible_shell_executable to /bin/sh 8975 1727204056.58676: Set connection var ansible_timeout to 10 8975 1727204056.58678: Set connection var ansible_shell_type to sh 8975 1727204056.58680: Set connection var ansible_pipelining to False 8975 1727204056.58696: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.58702: variable 'ansible_connection' from source: unknown 8975 1727204056.58711: variable 'ansible_module_compression' from source: unknown 8975 1727204056.58714: variable 'ansible_shell_type' from source: unknown 8975 1727204056.58716: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.58739: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.58744: variable 'ansible_pipelining' from source: unknown 8975 1727204056.58747: variable 'ansible_timeout' from source: unknown 8975 1727204056.58749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.58868: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204056.58879: variable 'omit' from source: magic vars 8975 1727204056.58884: starting attempt loop 8975 1727204056.58887: running the handler 8975 1727204056.58988: variable 'lsr_net_profile_fingerprint' from source: set_fact 8975 1727204056.58992: Evaluated conditional (lsr_net_profile_fingerprint): True 8975 1727204056.59000: handler run complete 8975 1727204056.59013: attempt loop complete, returning result 8975 1727204056.59016: _execute() done 8975 1727204056.59019: dumping result to json 8975 1727204056.59022: done dumping result, returning 8975 1727204056.59030: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.1 [127b8e07-fff9-9356-306d-000000000271] 8975 1727204056.59038: sending task result for task 127b8e07-fff9-9356-306d-000000000271 8975 1727204056.59135: done sending task result for task 127b8e07-fff9-9356-306d-000000000271 8975 1727204056.59138: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 8975 1727204056.59197: no more pending results, returning what we have 8975 1727204056.59200: results queue empty 8975 1727204056.59201: checking for any_errors_fatal 8975 1727204056.59211: done checking for any_errors_fatal 8975 1727204056.59212: checking for max_fail_percentage 8975 1727204056.59214: done checking for max_fail_percentage 8975 1727204056.59215: checking to see if all hosts have failed and the running result is not ok 8975 1727204056.59216: done checking to see if all hosts have failed 8975 1727204056.59217: getting the remaining hosts for this loop 8975 1727204056.59219: done getting the remaining hosts for this loop 8975 1727204056.59223: getting the next task for host managed-node2 8975 1727204056.59234: done getting next task for host managed-node2 8975 1727204056.59236: ^ task is: TASK: ** TEST check polling interval 8975 1727204056.59239: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204056.59244: getting variables 8975 1727204056.59245: in VariableManager get_vars() 8975 1727204056.59297: Calling all_inventory to load vars for managed-node2 8975 1727204056.59300: Calling groups_inventory to load vars for managed-node2 8975 1727204056.59302: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204056.59313: Calling all_plugins_play to load vars for managed-node2 8975 1727204056.59316: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204056.59318: Calling groups_plugins_play to load vars for managed-node2 8975 1727204056.60430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204056.62192: done with get_vars() 8975 1727204056.62221: done getting variables 8975 1727204056.62278: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.059) 0:00:27.939 ***** 8975 1727204056.62302: entering _queue_task() for managed-node2/command 8975 1727204056.62594: worker is 1 (out of 1 available) 8975 1727204056.62610: exiting _queue_task() for managed-node2/command 8975 1727204056.62625: done queuing things up, now waiting for results queue to drain 8975 1727204056.62627: waiting for pending results... 8975 1727204056.62818: running TaskExecutor() for managed-node2/TASK: ** TEST check polling interval 8975 1727204056.62895: in run() - task 127b8e07-fff9-9356-306d-000000000071 8975 1727204056.62909: variable 'ansible_search_path' from source: unknown 8975 1727204056.62943: calling self._execute() 8975 1727204056.63030: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.63034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.63044: variable 'omit' from source: magic vars 8975 1727204056.63347: variable 'ansible_distribution_major_version' from source: facts 8975 1727204056.63356: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204056.63363: variable 'omit' from source: magic vars 8975 1727204056.63382: variable 'omit' from source: magic vars 8975 1727204056.63460: variable 'controller_device' from source: play vars 8975 1727204056.63478: variable 'omit' from source: magic vars 8975 1727204056.63520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204056.63551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204056.63571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204056.63586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.63596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204056.63630: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204056.63634: 
variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.63637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.63712: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204056.63716: Set connection var ansible_connection to ssh 8975 1727204056.63720: Set connection var ansible_shell_executable to /bin/sh 8975 1727204056.63732: Set connection var ansible_timeout to 10 8975 1727204056.63734: Set connection var ansible_shell_type to sh 8975 1727204056.63742: Set connection var ansible_pipelining to False 8975 1727204056.63761: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.63764: variable 'ansible_connection' from source: unknown 8975 1727204056.63769: variable 'ansible_module_compression' from source: unknown 8975 1727204056.63771: variable 'ansible_shell_type' from source: unknown 8975 1727204056.63774: variable 'ansible_shell_executable' from source: unknown 8975 1727204056.63777: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204056.63781: variable 'ansible_pipelining' from source: unknown 8975 1727204056.63783: variable 'ansible_timeout' from source: unknown 8975 1727204056.63788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204056.63903: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204056.63914: variable 'omit' from source: magic vars 8975 1727204056.63919: starting attempt loop 8975 1727204056.63922: running the handler 8975 1727204056.63955: _low_level_execute_command(): starting 8975 1727204056.63958: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204056.64715: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204056.64720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204056.64723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204056.64726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204056.64747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204056.64751: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204056.64753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204056.64958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204056.64962: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204056.64968: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8975 1727204056.64971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204056.64974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204056.64977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204056.64980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204056.64983: stderr 
chunk (state=3): >>>debug2: match found <<< 8975 1727204056.64986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204056.64989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204056.64992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204056.64995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204056.65046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204056.66851: stdout chunk (state=3): >>>/root <<< 8975 1727204056.67078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204056.67083: stdout chunk (state=3): >>><<< 8975 1727204056.67085: stderr chunk (state=3): >>><<< 8975 1727204056.67110: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204056.67138: _low_level_execute_command(): starting 8975 1727204056.67188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302 `" && echo ansible-tmp-1727204056.6711743-11405-188382292256302="` echo /root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302 `" ) && sleep 0' 8975 1727204056.67918: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 8975 1727204056.68024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204056.68054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204056.68076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204056.68099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204056.68229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204056.70222: stdout chunk (state=3): >>>ansible-tmp-1727204056.6711743-11405-188382292256302=/root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302 <<< 8975 1727204056.70454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204056.70458: stdout chunk (state=3): >>><<< 8975 1727204056.70461: stderr chunk (state=3): >>><<< 8975 1727204056.70484: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204056.6711743-11405-188382292256302=/root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204056.70674: variable 'ansible_module_compression' from source: unknown 8975 1727204056.70678: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204056.70680: variable 'ansible_facts' from source: unknown 8975 1727204056.70733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/AnsiballZ_command.py 8975 1727204056.70915: Sending initial data 8975 1727204056.71026: Sent initial data (155 bytes) 8975 1727204056.71693: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204056.71710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204056.71725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204056.71791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204056.71853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204056.71875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204056.71910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204056.72024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204056.73656: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204056.73773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204056.73865: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpyyyw4noe /root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/AnsiballZ_command.py <<< 8975 1727204056.73871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/AnsiballZ_command.py" <<< 8975 1727204056.73934: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpyyyw4noe" to remote "/root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/AnsiballZ_command.py" <<< 8975 1727204056.75040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204056.75186: stderr chunk (state=3): >>><<< 8975 1727204056.75189: stdout chunk (state=3): >>><<< 8975 1727204056.75192: done transferring module to remote 8975 1727204056.75194: _low_level_execute_command(): starting 8975 1727204056.75196: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/ /root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/AnsiballZ_command.py && sleep 0' 8975 1727204056.75960: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204056.75999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204056.76022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204056.76122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204056.78080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204056.78105: stdout chunk (state=3): >>><<< 8975 1727204056.78109: stderr chunk (state=3): >>><<< 8975 1727204056.78217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204056.78221: _low_level_execute_command(): starting 8975 1727204056.78225: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/AnsiballZ_command.py && sleep 0' 8975 1727204056.78896: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204056.78925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204056.78984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204056.79069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204056.79090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204056.79115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204056.79232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204056.96170: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-24 14:54:16.956860", "end": "2024-09-24 14:54:16.960358", "delta": "0:00:00.003498", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204056.97692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204056.97750: stderr chunk (state=3): >>><<< 8975 1727204056.97753: stdout chunk (state=3): >>><<< 8975 1727204056.97770: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-24 14:54:16.956860", "end": "2024-09-24 14:54:16.960358", "delta": "0:00:00.003498", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
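The exchange above is the remote run of the polling-interval check defined at tests_bond_deprecated.yml:75. A minimal sketch of what that task plausibly looks like, reconstructed from the _raw_params and the "'110' in result.stdout" conditional this log evaluates just below; templating the path through {{ controller_device }} and the presence of a retry loop are assumptions (the log only shows the resolved device name deprecated-bond and attempts: 1):

    # Hedged sketch, not the actual playbook source.
    # {{ controller_device }} is assumed; the log shows it resolving to "deprecated-bond".
    - name: "** TEST check polling interval"
      command: grep 'Polling Interval' /proc/net/bonding/{{ controller_device }}
      register: result
      until: "'110' in result.stdout"

With this shape, the resolved command matches the cmd list reported in the result that follows: ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], and the short-name command module resolves to ansible.legacy.command as logged.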
8975 1727204056.97805: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204056.97812: _low_level_execute_command(): starting 8975 1727204056.97817: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204056.6711743-11405-188382292256302/ > /dev/null 2>&1 && sleep 0' 8975 1727204056.98320: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204056.98324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204056.98327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204056.98337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204056.98340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204056.98382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204056.98385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204056.98388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204056.98463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.00378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.00428: stderr chunk (state=3): >>><<< 8975 1727204057.00432: stdout chunk (state=3): >>><<< 8975 1727204057.00447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.00455: handler run complete 8975 1727204057.00478: Evaluated conditional (False): False 8975 1727204057.00605: variable 'result' from source: unknown 8975 1727204057.00620: Evaluated conditional ('110' in result.stdout): True 8975 1727204057.00630: attempt loop complete, returning result 8975 1727204057.00635: _execute() done 8975 1727204057.00638: dumping result to json 8975 1727204057.00644: done dumping result, returning 8975 1727204057.00652: done running TaskExecutor() for managed-node2/TASK: ** TEST check polling interval [127b8e07-fff9-9356-306d-000000000071] 8975 1727204057.00657: sending task result for task 127b8e07-fff9-9356-306d-000000000071 8975 1727204057.00768: done sending task result for task 127b8e07-fff9-9356-306d-000000000071 8975 1727204057.00771: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/deprecated-bond" ], "delta": "0:00:00.003498", "end": "2024-09-24 14:54:16.960358", "rc": 0, "start": "2024-09-24 14:54:16.956860" } STDOUT: MII Polling Interval (ms): 110 8975 1727204057.00874: no more pending results, returning what we have 8975 1727204057.00878: results queue empty 8975 1727204057.00879: checking for any_errors_fatal 8975 1727204057.00891: done checking for any_errors_fatal 8975 1727204057.00892: checking for max_fail_percentage 8975 1727204057.00894: done checking for max_fail_percentage 8975 1727204057.00895: checking to see if all hosts have failed and the running result is not ok 8975 1727204057.00896: done checking to see if all hosts have failed 8975 1727204057.00896: getting the remaining hosts for this loop 8975 1727204057.00898: done getting the remaining hosts for this loop 8975 1727204057.00903: getting the next task for host managed-node2 8975 1727204057.00910: done getting next task for host managed-node2 8975 1727204057.00913: ^ task is: TASK: ** TEST check IPv4 8975 1727204057.00915: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204057.00918: getting variables 8975 1727204057.00919: in VariableManager get_vars() 8975 1727204057.00958: Calling all_inventory to load vars for managed-node2 8975 1727204057.00960: Calling groups_inventory to load vars for managed-node2 8975 1727204057.00962: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204057.00976: Calling all_plugins_play to load vars for managed-node2 8975 1727204057.00978: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204057.00981: Calling groups_plugins_play to load vars for managed-node2 8975 1727204057.02063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204057.03255: done with get_vars() 8975 1727204057.03290: done getting variables 8975 1727204057.03342: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:80 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.410) 0:00:28.350 ***** 8975 1727204057.03369: entering _queue_task() for managed-node2/command 8975 1727204057.03668: worker is 1 (out of 1 available) 8975 1727204057.03685: exiting _queue_task() for managed-node2/command 8975 1727204057.03702: done queuing things up, now waiting for results queue to drain 8975 1727204057.03703: waiting for pending results... 8975 1727204057.03890: running TaskExecutor() for managed-node2/TASK: ** TEST check IPv4 8975 1727204057.03971: in run() - task 127b8e07-fff9-9356-306d-000000000072 8975 1727204057.03983: variable 'ansible_search_path' from source: unknown 8975 1727204057.04018: calling self._execute() 8975 1727204057.04102: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204057.04107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204057.04116: variable 'omit' from source: magic vars 8975 1727204057.04419: variable 'ansible_distribution_major_version' from source: facts 8975 1727204057.04437: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204057.04441: variable 'omit' from source: magic vars 8975 1727204057.04457: variable 'omit' from source: magic vars 8975 1727204057.04539: variable 'controller_device' from source: play vars 8975 1727204057.04557: variable 'omit' from source: magic vars 8975 1727204057.04601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204057.04634: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204057.04649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204057.04665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204057.04676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204057.04707: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204057.04710: variable 
'ansible_host' from source: host vars for 'managed-node2' 8975 1727204057.04712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204057.04791: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204057.04797: Set connection var ansible_connection to ssh 8975 1727204057.04800: Set connection var ansible_shell_executable to /bin/sh 8975 1727204057.04812: Set connection var ansible_timeout to 10 8975 1727204057.04815: Set connection var ansible_shell_type to sh 8975 1727204057.04822: Set connection var ansible_pipelining to False 8975 1727204057.04842: variable 'ansible_shell_executable' from source: unknown 8975 1727204057.04846: variable 'ansible_connection' from source: unknown 8975 1727204057.04849: variable 'ansible_module_compression' from source: unknown 8975 1727204057.04851: variable 'ansible_shell_type' from source: unknown 8975 1727204057.04854: variable 'ansible_shell_executable' from source: unknown 8975 1727204057.04856: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204057.04859: variable 'ansible_pipelining' from source: unknown 8975 1727204057.04861: variable 'ansible_timeout' from source: unknown 8975 1727204057.04868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204057.04982: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204057.04992: variable 'omit' from source: magic vars 8975 1727204057.04997: starting attempt loop 8975 1727204057.05000: running the handler 8975 1727204057.05014: _low_level_execute_command(): starting 8975 1727204057.05025: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204057.05573: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.05579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204057.05602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.05648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.05652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204057.05658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.05730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.07384: stdout chunk (state=3): >>>/root <<< 8975 1727204057.07488: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.07559: stderr chunk (state=3): >>><<< 8975 1727204057.07562: stdout chunk (state=3): >>><<< 8975 1727204057.07588: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.07599: _low_level_execute_command(): starting 8975 1727204057.07607: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114 `" && echo ansible-tmp-1727204057.0758731-11417-242864260890114="` echo /root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114 `" ) && sleep 0' 8975 1727204057.08124: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.08129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.08132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.08142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.08193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.08197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.08274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.10243: stdout chunk (state=3): >>>ansible-tmp-1727204057.0758731-11417-242864260890114=/root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114 <<< 8975 
1727204057.10354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.10414: stderr chunk (state=3): >>><<< 8975 1727204057.10417: stdout chunk (state=3): >>><<< 8975 1727204057.10440: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204057.0758731-11417-242864260890114=/root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.10472: variable 'ansible_module_compression' from source: unknown 8975 1727204057.10514: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204057.10548: variable 'ansible_facts' from source: unknown 8975 1727204057.10610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/AnsiballZ_command.py 8975 1727204057.10731: Sending initial data 8975 1727204057.10734: Sent initial data (155 bytes) 8975 1727204057.11252: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204057.11256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.11258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204057.11260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.11263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.11318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.11321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204057.11326: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.11399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.12996: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204057.13068: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204057.13135: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpqgipkctj /root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/AnsiballZ_command.py <<< 8975 1727204057.13138: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/AnsiballZ_command.py" <<< 8975 1727204057.13202: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpqgipkctj" to remote "/root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/AnsiballZ_command.py" <<< 8975 1727204057.13209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/AnsiballZ_command.py" <<< 8975 1727204057.13882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.13963: stderr chunk (state=3): >>><<< 8975 1727204057.13968: stdout chunk (state=3): >>><<< 8975 1727204057.13988: done transferring module to remote 8975 1727204057.13998: _low_level_execute_command(): starting 8975 1727204057.14004: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/ /root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/AnsiballZ_command.py && sleep 0' 8975 1727204057.14505: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.14508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.14511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.14513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.14565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.14574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204057.14579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.14644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.16456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.16512: stderr chunk (state=3): >>><<< 8975 1727204057.16515: stdout chunk (state=3): >>><<< 8975 1727204057.16533: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.16537: _low_level_execute_command(): starting 8975 1727204057.16539: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/AnsiballZ_command.py && sleep 0' 8975 1727204057.17051: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204057.17055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.17057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204057.17060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.17062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204057.17064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.17115: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.17120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204057.17123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.17196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.33863: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.40/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:17.333550", "end": "2024-09-24 14:54:17.337254", "delta": "0:00:00.003704", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204057.35328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204057.35391: stderr chunk (state=3): >>><<< 8975 1727204057.35395: stdout chunk (state=3): >>><<< 8975 1727204057.35410: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.40/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:17.333550", "end": "2024-09-24 14:54:17.337254", "delta": "0:00:00.003704", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
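The stdout above belongs to the IPv4 check at tests_bond_deprecated.yml:80, which follows the same pattern as the previous task. A minimal sketch, inferred from the module arguments (ip -4 a s deprecated-bond) and the "'192.0.2' in result.stdout" conditional evaluated further down; routing the device name through {{ controller_device }} and the retry loop are again assumptions based on the play-vars lookup earlier in this task and the attempts counter in the result:

    # Hedged sketch, not the actual playbook source.
    # {{ controller_device }} is assumed; the log shows it resolving to "deprecated-bond".
    - name: "** TEST check IPv4"
      command: ip -4 a s {{ controller_device }}
      register: result
      until: "'192.0.2' in result.stdout"

The reported result (attempts: 1, rc: 0) indicates the check passed on the first try, consistent with the dynamic 192.0.2.40/24 address shown in the captured stdout.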
8975 1727204057.35445: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204057.35453: _low_level_execute_command(): starting 8975 1727204057.35461: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204057.0758731-11417-242864260890114/ > /dev/null 2>&1 && sleep 0' 8975 1727204057.35939: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.35949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204057.35978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.35982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.35984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.36046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.36051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.36132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.38034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.38094: stderr chunk (state=3): >>><<< 8975 1727204057.38097: stdout chunk (state=3): >>><<< 8975 1727204057.38110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.38117: handler run complete 8975 1727204057.38142: Evaluated conditional (False): False 8975 1727204057.38262: variable 'result' from source: set_fact 8975 1727204057.38281: Evaluated conditional ('192.0.2' in result.stdout): True 8975 1727204057.38291: attempt loop complete, returning result 8975 1727204057.38294: _execute() done 8975 1727204057.38297: dumping result to json 8975 1727204057.38302: done dumping result, returning 8975 1727204057.38309: done running TaskExecutor() for managed-node2/TASK: ** TEST check IPv4 [127b8e07-fff9-9356-306d-000000000072] 8975 1727204057.38315: sending task result for task 127b8e07-fff9-9356-306d-000000000072 8975 1727204057.38427: done sending task result for task 127b8e07-fff9-9356-306d-000000000072 8975 1727204057.38432: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003704", "end": "2024-09-24 14:54:17.337254", "rc": 0, "start": "2024-09-24 14:54:17.333550" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.40/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond valid_lft 235sec preferred_lft 235sec 8975 1727204057.38516: no more pending results, returning what we have 8975 1727204057.38520: results queue empty 8975 1727204057.38520: checking for any_errors_fatal 8975 1727204057.38533: done checking for any_errors_fatal 8975 1727204057.38534: checking for max_fail_percentage 8975 1727204057.38535: done checking for max_fail_percentage 8975 1727204057.38536: checking to see if all hosts have failed and the running result is not ok 8975 1727204057.38538: done checking to see if all hosts have failed 8975 1727204057.38538: getting the remaining hosts for this loop 8975 1727204057.38542: done getting the remaining hosts for this loop 8975 1727204057.38546: getting the next task for host managed-node2 8975 1727204057.38553: done getting next task for host managed-node2 8975 1727204057.38556: ^ task is: TASK: ** TEST check IPv6 8975 1727204057.38558: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8975 1727204057.38561: getting variables 8975 1727204057.38562: in VariableManager get_vars() 8975 1727204057.38604: Calling all_inventory to load vars for managed-node2 8975 1727204057.38607: Calling groups_inventory to load vars for managed-node2 8975 1727204057.38609: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204057.38620: Calling all_plugins_play to load vars for managed-node2 8975 1727204057.38623: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204057.38625: Calling groups_plugins_play to load vars for managed-node2 8975 1727204057.39638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204057.40817: done with get_vars() 8975 1727204057.40850: done getting variables 8975 1727204057.40904: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:87 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.375) 0:00:28.726 ***** 8975 1727204057.40927: entering _queue_task() for managed-node2/command 8975 1727204057.41221: worker is 1 (out of 1 available) 8975 1727204057.41235: exiting _queue_task() for managed-node2/command 8975 1727204057.41249: done queuing things up, now waiting for results queue to drain 8975 1727204057.41251: waiting for pending results... 8975 1727204057.41444: running TaskExecutor() for managed-node2/TASK: ** TEST check IPv6 8975 1727204057.41519: in run() - task 127b8e07-fff9-9356-306d-000000000073 8975 1727204057.41534: variable 'ansible_search_path' from source: unknown 8975 1727204057.41569: calling self._execute() 8975 1727204057.41656: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204057.41660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204057.41672: variable 'omit' from source: magic vars 8975 1727204057.41979: variable 'ansible_distribution_major_version' from source: facts 8975 1727204057.41988: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204057.41995: variable 'omit' from source: magic vars 8975 1727204057.42012: variable 'omit' from source: magic vars 8975 1727204057.42089: variable 'controller_device' from source: play vars 8975 1727204057.42104: variable 'omit' from source: magic vars 8975 1727204057.42151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204057.42183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204057.42202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204057.42217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204057.42227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204057.42260: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204057.42263: variable 
'ansible_host' from source: host vars for 'managed-node2' 8975 1727204057.42267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204057.42343: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204057.42348: Set connection var ansible_connection to ssh 8975 1727204057.42351: Set connection var ansible_shell_executable to /bin/sh 8975 1727204057.42357: Set connection var ansible_timeout to 10 8975 1727204057.42359: Set connection var ansible_shell_type to sh 8975 1727204057.42374: Set connection var ansible_pipelining to False 8975 1727204057.42391: variable 'ansible_shell_executable' from source: unknown 8975 1727204057.42394: variable 'ansible_connection' from source: unknown 8975 1727204057.42398: variable 'ansible_module_compression' from source: unknown 8975 1727204057.42400: variable 'ansible_shell_type' from source: unknown 8975 1727204057.42402: variable 'ansible_shell_executable' from source: unknown 8975 1727204057.42405: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204057.42410: variable 'ansible_pipelining' from source: unknown 8975 1727204057.42413: variable 'ansible_timeout' from source: unknown 8975 1727204057.42417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204057.42536: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204057.42546: variable 'omit' from source: magic vars 8975 1727204057.42551: starting attempt loop 8975 1727204057.42555: running the handler 8975 1727204057.42571: _low_level_execute_command(): starting 8975 1727204057.42579: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204057.43117: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.43151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.43202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.43206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204057.43208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.43286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.44932: stdout chunk (state=3): >>>/root <<< 8975 1727204057.45035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 8975 1727204057.45106: stderr chunk (state=3): >>><<< 8975 1727204057.45110: stdout chunk (state=3): >>><<< 8975 1727204057.45135: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.45144: _low_level_execute_command(): starting 8975 1727204057.45152: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632 `" && echo ansible-tmp-1727204057.4513195-11425-251858406672632="` echo /root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632 `" ) && sleep 0' 8975 1727204057.45821: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.45879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204057.45898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.45983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204057.46015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.46137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.48124: stdout chunk (state=3): >>>ansible-tmp-1727204057.4513195-11425-251858406672632=/root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632 <<< 8975 1727204057.48353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.48358: 
stdout chunk (state=3): >>><<< 8975 1727204057.48361: stderr chunk (state=3): >>><<< 8975 1727204057.48424: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204057.4513195-11425-251858406672632=/root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.48445: variable 'ansible_module_compression' from source: unknown 8975 1727204057.48509: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204057.48564: variable 'ansible_facts' from source: unknown 8975 1727204057.48670: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/AnsiballZ_command.py 8975 1727204057.48886: Sending initial data 8975 1727204057.48889: Sent initial data (155 bytes) 8975 1727204057.49431: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.49448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.49516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.49524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204057.49529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.49593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.51191: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204057.51266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204057.51337: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp598kgoag /root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/AnsiballZ_command.py <<< 8975 1727204057.51340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/AnsiballZ_command.py" <<< 8975 1727204057.51404: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp598kgoag" to remote "/root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/AnsiballZ_command.py" <<< 8975 1727204057.51412: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/AnsiballZ_command.py" <<< 8975 1727204057.52077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.52217: stderr chunk (state=3): >>><<< 8975 1727204057.52221: stdout chunk (state=3): >>><<< 8975 1727204057.52223: done transferring module to remote 8975 1727204057.52225: _low_level_execute_command(): starting 8975 1727204057.52230: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/ /root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/AnsiballZ_command.py && sleep 0' 8975 1727204057.52733: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204057.52737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.52740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204057.52742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204057.52744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.52801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.52804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.52878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.54816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.54820: stdout chunk (state=3): >>><<< 8975 1727204057.54823: stderr chunk (state=3): >>><<< 8975 1727204057.54825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.54830: _low_level_execute_command(): starting 8975 1727204057.54833: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/AnsiballZ_command.py && sleep 0' 8975 1727204057.55548: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204057.55555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204057.55559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204057.55562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204057.55564: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204057.55659: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204057.55663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.55670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.55673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.55682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
8975 1727204057.55716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.55831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.72783: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::136/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::9c7f:b1ff:feea:51c2/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::9c7f:b1ff:feea:51c2/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:17.722649", "end": "2024-09-24 14:54:17.726406", "delta": "0:00:00.003757", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204057.74439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204057.74532: stderr chunk (state=3): >>><<< 8975 1727204057.74536: stdout chunk (state=3): >>><<< 8975 1727204057.74811: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::136/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::9c7f:b1ff:feea:51c2/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::9c7f:b1ff:feea:51c2/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:17.722649", "end": "2024-09-24 14:54:17.726406", "delta": "0:00:00.003757", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204057.74817: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204057.74820: _low_level_execute_command(): starting 8975 1727204057.74823: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204057.4513195-11425-251858406672632/ > /dev/null 2>&1 && sleep 0' 8975 1727204057.75727: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204057.75751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204057.75807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204057.75823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204057.75898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204057.75930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204057.75972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204057.76042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204057.78148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204057.78597: stdout chunk (state=3): >>><<< 8975 1727204057.78601: stderr chunk (state=3): >>><<< 8975 1727204057.78604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204057.78607: handler run complete 8975 1727204057.78609: Evaluated conditional (False): False 8975 1727204057.78728: variable 'result' from source: set_fact 8975 1727204057.78753: Evaluated conditional ('2001' in result.stdout): True 8975 1727204057.78814: attempt loop complete, returning result 8975 1727204057.78823: _execute() done 8975 1727204057.78830: dumping result to json 8975 1727204057.78880: done dumping result, returning 8975 1727204057.78893: done running TaskExecutor() for managed-node2/TASK: ** TEST check IPv6 [127b8e07-fff9-9356-306d-000000000073] 8975 1727204057.78907: sending task result for task 127b8e07-fff9-9356-306d-000000000073 ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003757", "end": "2024-09-24 14:54:17.726406", "rc": 0, "start": "2024-09-24 14:54:17.722649" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::136/128 scope global dynamic noprefixroute valid_lft 236sec preferred_lft 236sec inet6 2001:db8::9c7f:b1ff:feea:51c2/64 scope global dynamic noprefixroute valid_lft 1794sec preferred_lft 1794sec inet6 fe80::9c7f:b1ff:feea:51c2/64 scope link noprefixroute valid_lft forever preferred_lft forever 8975 1727204057.79177: no more pending results, returning what we have 8975 1727204057.79181: results queue empty 8975 1727204057.79182: checking for any_errors_fatal 8975 1727204057.79191: done checking for any_errors_fatal 8975 1727204057.79192: checking for max_fail_percentage 8975 1727204057.79193: done checking for max_fail_percentage 8975 1727204057.79195: checking to see if all hosts have failed and the running result is not ok 8975 1727204057.79196: done checking to see if all hosts have failed 8975 1727204057.79197: getting the remaining hosts for this loop 8975 1727204057.79199: done getting the remaining hosts for this loop 8975 1727204057.79203: getting the next task for host managed-node2 8975 1727204057.79216: done getting next task for host managed-node2 8975 1727204057.79222: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8975 1727204057.79227: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 8975 1727204057.79248: getting variables 8975 1727204057.79249: in VariableManager get_vars() 8975 1727204057.79415: Calling all_inventory to load vars for managed-node2 8975 1727204057.79419: Calling groups_inventory to load vars for managed-node2 8975 1727204057.79422: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204057.79504: Calling all_plugins_play to load vars for managed-node2 8975 1727204057.79508: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204057.79511: Calling groups_plugins_play to load vars for managed-node2 8975 1727204057.80116: done sending task result for task 127b8e07-fff9-9356-306d-000000000073 8975 1727204057.80121: WORKER PROCESS EXITING 8975 1727204057.89496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204057.92285: done with get_vars() 8975 1727204057.92326: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.514) 0:00:29.241 ***** 8975 1727204057.92425: entering _queue_task() for managed-node2/include_tasks 8975 1727204057.92824: worker is 1 (out of 1 available) 8975 1727204057.92842: exiting _queue_task() for managed-node2/include_tasks 8975 1727204057.92856: done queuing things up, now waiting for results queue to drain 8975 1727204057.92858: waiting for pending results... 8975 1727204057.93085: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 8975 1727204057.93214: in run() - task 127b8e07-fff9-9356-306d-00000000007d 8975 1727204057.93226: variable 'ansible_search_path' from source: unknown 8975 1727204057.93233: variable 'ansible_search_path' from source: unknown 8975 1727204057.93272: calling self._execute() 8975 1727204057.93354: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204057.93358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204057.93371: variable 'omit' from source: magic vars 8975 1727204057.93688: variable 'ansible_distribution_major_version' from source: facts 8975 1727204057.93698: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204057.93704: _execute() done 8975 1727204057.93708: dumping result to json 8975 1727204057.93710: done dumping result, returning 8975 1727204057.93720: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-9356-306d-00000000007d] 8975 1727204057.93725: sending task result for task 127b8e07-fff9-9356-306d-00000000007d 8975 1727204057.93850: done sending task result for task 127b8e07-fff9-9356-306d-00000000007d 8975 1727204057.93855: WORKER PROCESS EXITING 8975 1727204057.93907: no more pending results, returning what we have 8975 1727204057.93913: in VariableManager get_vars() 8975 1727204057.93973: Calling all_inventory to load vars for managed-node2 8975 1727204057.93977: Calling groups_inventory to load vars for managed-node2 8975 1727204057.93980: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204057.93992: Calling all_plugins_play to load vars for managed-node2 8975 1727204057.93995: Calling groups_plugins_inventory to load vars for 
managed-node2 8975 1727204057.93998: Calling groups_plugins_play to load vars for managed-node2 8975 1727204057.95402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204057.96936: done with get_vars() 8975 1727204057.96955: variable 'ansible_search_path' from source: unknown 8975 1727204057.96956: variable 'ansible_search_path' from source: unknown 8975 1727204057.96995: we have included files to process 8975 1727204057.96996: generating all_blocks data 8975 1727204057.96998: done generating all_blocks data 8975 1727204057.97002: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8975 1727204057.97003: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8975 1727204057.97004: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 8975 1727204057.97441: done processing included file 8975 1727204057.97443: iterating over new_blocks loaded from include file 8975 1727204057.97444: in VariableManager get_vars() 8975 1727204057.97464: done with get_vars() 8975 1727204057.97467: filtering new block on tags 8975 1727204057.97491: done filtering new block on tags 8975 1727204057.97494: in VariableManager get_vars() 8975 1727204057.97512: done with get_vars() 8975 1727204057.97513: filtering new block on tags 8975 1727204057.97542: done filtering new block on tags 8975 1727204057.97544: in VariableManager get_vars() 8975 1727204057.97562: done with get_vars() 8975 1727204057.97563: filtering new block on tags 8975 1727204057.97593: done filtering new block on tags 8975 1727204057.97595: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 8975 1727204057.97599: extending task lists for all hosts with included blocks 8975 1727204057.98457: done extending task lists 8975 1727204057.98459: done processing included files 8975 1727204057.98460: results queue empty 8975 1727204057.98461: checking for any_errors_fatal 8975 1727204057.98469: done checking for any_errors_fatal 8975 1727204057.98470: checking for max_fail_percentage 8975 1727204057.98472: done checking for max_fail_percentage 8975 1727204057.98473: checking to see if all hosts have failed and the running result is not ok 8975 1727204057.98474: done checking to see if all hosts have failed 8975 1727204057.98474: getting the remaining hosts for this loop 8975 1727204057.98476: done getting the remaining hosts for this loop 8975 1727204057.98479: getting the next task for host managed-node2 8975 1727204057.98484: done getting next task for host managed-node2 8975 1727204057.98487: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8975 1727204057.98491: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204057.98502: getting variables 8975 1727204057.98503: in VariableManager get_vars() 8975 1727204057.98523: Calling all_inventory to load vars for managed-node2 8975 1727204057.98525: Calling groups_inventory to load vars for managed-node2 8975 1727204057.98527: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204057.98533: Calling all_plugins_play to load vars for managed-node2 8975 1727204057.98536: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204057.98539: Calling groups_plugins_play to load vars for managed-node2 8975 1727204058.00033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204058.02174: done with get_vars() 8975 1727204058.02212: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.098) 0:00:29.340 ***** 8975 1727204058.02311: entering _queue_task() for managed-node2/setup 8975 1727204058.02708: worker is 1 (out of 1 available) 8975 1727204058.02723: exiting _queue_task() for managed-node2/setup 8975 1727204058.02736: done queuing things up, now waiting for results queue to drain 8975 1727204058.02738: waiting for pending results... 
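
Stepping back to the two "** TEST check IPv4" and "** TEST check IPv6" results earlier in this run: they come from command tasks in tests_bond_deprecated.yml that poll the bond interface until the expected address family appears. The playbook source itself is not reproduced in this log, so the sketch below is only a reconstruction from the cmd values, the evaluated until conditions, and the controller_device play variable referenced above; the retries and delay values are assumptions.

- name: "** TEST check IPv4"
  command: ip -4 a s {{ controller_device }}   # log shows cmd ["ip", "-4", "a", "s", "deprecated-bond"]
  register: result
  until: "'192.0.2' in result.stdout"          # conditional evaluated True above
  retries: 20                                  # assumed; the run above succeeded on attempt 1
  delay: 2                                     # assumed

- name: "** TEST check IPv6"                   # task path tests_bond_deprecated.yml:87
  command: ip -6 a s {{ controller_device }}
  register: result
  until: "'2001' in result.stdout"             # conditional evaluated True above
  retries: 20                                  # assumed
  delay: 2                                     # assumed
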
8975 1727204058.03198: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 8975 1727204058.03207: in run() - task 127b8e07-fff9-9356-306d-000000000494 8975 1727204058.03213: variable 'ansible_search_path' from source: unknown 8975 1727204058.03216: variable 'ansible_search_path' from source: unknown 8975 1727204058.03251: calling self._execute() 8975 1727204058.03372: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204058.03377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204058.03380: variable 'omit' from source: magic vars 8975 1727204058.03842: variable 'ansible_distribution_major_version' from source: facts 8975 1727204058.03847: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204058.04058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204058.06679: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204058.06762: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204058.06803: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204058.06870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204058.06874: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204058.06970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204058.07001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204058.07070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204058.07077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204058.07092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204058.07155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204058.07179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204058.07206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204058.07252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204058.07325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204058.07450: variable '__network_required_facts' from source: role '' defaults 8975 1727204058.07466: variable 'ansible_facts' from source: unknown 8975 1727204058.08440: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 8975 1727204058.08446: when evaluation is False, skipping this task 8975 1727204058.08449: _execute() done 8975 1727204058.08451: dumping result to json 8975 1727204058.08453: done dumping result, returning 8975 1727204058.08456: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-9356-306d-000000000494] 8975 1727204058.08458: sending task result for task 127b8e07-fff9-9356-306d-000000000494 8975 1727204058.08663: done sending task result for task 127b8e07-fff9-9356-306d-000000000494 8975 1727204058.08668: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204058.08747: no more pending results, returning what we have 8975 1727204058.08751: results queue empty 8975 1727204058.08752: checking for any_errors_fatal 8975 1727204058.08755: done checking for any_errors_fatal 8975 1727204058.08755: checking for max_fail_percentage 8975 1727204058.08757: done checking for max_fail_percentage 8975 1727204058.08758: checking to see if all hosts have failed and the running result is not ok 8975 1727204058.08759: done checking to see if all hosts have failed 8975 1727204058.08760: getting the remaining hosts for this loop 8975 1727204058.08762: done getting the remaining hosts for this loop 8975 1727204058.08769: getting the next task for host managed-node2 8975 1727204058.08781: done getting next task for host managed-node2 8975 1727204058.08786: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 8975 1727204058.08791: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204058.08813: getting variables 8975 1727204058.08815: in VariableManager get_vars() 8975 1727204058.08861: Calling all_inventory to load vars for managed-node2 8975 1727204058.08865: Calling groups_inventory to load vars for managed-node2 8975 1727204058.08972: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204058.08982: Calling all_plugins_play to load vars for managed-node2 8975 1727204058.08985: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204058.08989: Calling groups_plugins_play to load vars for managed-node2 8975 1727204058.11111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204058.13401: done with get_vars() 8975 1727204058.13440: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.112) 0:00:29.452 ***** 8975 1727204058.13569: entering _queue_task() for managed-node2/stat 8975 1727204058.13955: worker is 1 (out of 1 available) 8975 1727204058.13972: exiting _queue_task() for managed-node2/stat 8975 1727204058.13986: done queuing things up, now waiting for results queue to drain 8975 1727204058.13987: waiting for pending results... 8975 1727204058.14890: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 8975 1727204058.14940: in run() - task 127b8e07-fff9-9356-306d-000000000496 8975 1727204058.15003: variable 'ansible_search_path' from source: unknown 8975 1727204058.15200: variable 'ansible_search_path' from source: unknown 8975 1727204058.15204: calling self._execute() 8975 1727204058.15280: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204058.15293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204058.15314: variable 'omit' from source: magic vars 8975 1727204058.15716: variable 'ansible_distribution_major_version' from source: facts 8975 1727204058.15736: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204058.15925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204058.16422: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204058.16440: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204058.16485: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204058.16533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204058.16695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204058.16734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204058.16768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204058.16827: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204058.16909: variable '__network_is_ostree' from source: set_fact 8975 1727204058.16922: Evaluated conditional (not __network_is_ostree is defined): False 8975 1727204058.16970: when evaluation is False, skipping this task 8975 1727204058.16977: _execute() done 8975 1727204058.16979: dumping result to json 8975 1727204058.16982: done dumping result, returning 8975 1727204058.16984: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-9356-306d-000000000496] 8975 1727204058.16986: sending task result for task 127b8e07-fff9-9356-306d-000000000496 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8975 1727204058.17137: no more pending results, returning what we have 8975 1727204058.17141: results queue empty 8975 1727204058.17141: checking for any_errors_fatal 8975 1727204058.17149: done checking for any_errors_fatal 8975 1727204058.17150: checking for max_fail_percentage 8975 1727204058.17152: done checking for max_fail_percentage 8975 1727204058.17153: checking to see if all hosts have failed and the running result is not ok 8975 1727204058.17154: done checking to see if all hosts have failed 8975 1727204058.17155: getting the remaining hosts for this loop 8975 1727204058.17157: done getting the remaining hosts for this loop 8975 1727204058.17161: getting the next task for host managed-node2 8975 1727204058.17174: done getting next task for host managed-node2 8975 1727204058.17178: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8975 1727204058.17183: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204058.17203: getting variables 8975 1727204058.17205: in VariableManager get_vars() 8975 1727204058.17255: Calling all_inventory to load vars for managed-node2 8975 1727204058.17258: Calling groups_inventory to load vars for managed-node2 8975 1727204058.17261: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204058.17478: Calling all_plugins_play to load vars for managed-node2 8975 1727204058.17483: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204058.17487: Calling groups_plugins_play to load vars for managed-node2 8975 1727204058.18184: done sending task result for task 127b8e07-fff9-9356-306d-000000000496 8975 1727204058.18188: WORKER PROCESS EXITING 8975 1727204058.19504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204058.21786: done with get_vars() 8975 1727204058.21825: done getting variables 8975 1727204058.21896: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.083) 0:00:29.536 ***** 8975 1727204058.21944: entering _queue_task() for managed-node2/set_fact 8975 1727204058.22386: worker is 1 (out of 1 available) 8975 1727204058.22401: exiting _queue_task() for managed-node2/set_fact 8975 1727204058.22416: done queuing things up, now waiting for results queue to drain 8975 1727204058.22417: waiting for pending results... 
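
The three tasks from set_facts.yml seen above ("Ensure ansible_facts used by role are present", "Check if system is ostree", "Set flag to indicate system is ostree") follow a fact-bootstrapping pattern: re-run setup only when required facts are missing, and probe for ostree only when the cached flag is absent. The actual task file ships with the fedora.linux_system_roles.network role; the sketch below is an approximation built around the conditionals and the no_log censoring quoted in the log, and the gather_subset value, probe path, and register name are assumptions.

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min                  # assumed subset
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                          # the skipped result above is censored for this reason

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted            # assumed probe path
  register: __ostree_booted_stat        # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"
  when: not __network_is_ostree is defined
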
8975 1727204058.22835: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 8975 1727204058.23037: in run() - task 127b8e07-fff9-9356-306d-000000000497 8975 1727204058.23068: variable 'ansible_search_path' from source: unknown 8975 1727204058.23079: variable 'ansible_search_path' from source: unknown 8975 1727204058.23124: calling self._execute() 8975 1727204058.23246: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204058.23263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204058.23280: variable 'omit' from source: magic vars 8975 1727204058.23893: variable 'ansible_distribution_major_version' from source: facts 8975 1727204058.23914: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204058.24198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204058.24537: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204058.24600: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204058.24650: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204058.24753: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204058.24870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204058.24905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204058.24944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204058.24987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204058.25100: variable '__network_is_ostree' from source: set_fact 8975 1727204058.25115: Evaluated conditional (not __network_is_ostree is defined): False 8975 1727204058.25123: when evaluation is False, skipping this task 8975 1727204058.25134: _execute() done 8975 1727204058.25143: dumping result to json 8975 1727204058.25191: done dumping result, returning 8975 1727204058.25195: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-9356-306d-000000000497] 8975 1727204058.25198: sending task result for task 127b8e07-fff9-9356-306d-000000000497 8975 1727204058.25475: done sending task result for task 127b8e07-fff9-9356-306d-000000000497 8975 1727204058.25479: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 8975 1727204058.25535: no more pending results, returning what we have 8975 1727204058.25539: results queue empty 8975 1727204058.25540: checking for any_errors_fatal 8975 1727204058.25549: done checking for any_errors_fatal 8975 1727204058.25550: checking for 
max_fail_percentage 8975 1727204058.25552: done checking for max_fail_percentage 8975 1727204058.25553: checking to see if all hosts have failed and the running result is not ok 8975 1727204058.25555: done checking to see if all hosts have failed 8975 1727204058.25555: getting the remaining hosts for this loop 8975 1727204058.25557: done getting the remaining hosts for this loop 8975 1727204058.25562: getting the next task for host managed-node2 8975 1727204058.25574: done getting next task for host managed-node2 8975 1727204058.25578: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 8975 1727204058.25583: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204058.25603: getting variables 8975 1727204058.25605: in VariableManager get_vars() 8975 1727204058.25654: Calling all_inventory to load vars for managed-node2 8975 1727204058.25657: Calling groups_inventory to load vars for managed-node2 8975 1727204058.25659: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204058.25850: Calling all_plugins_play to load vars for managed-node2 8975 1727204058.25854: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204058.25857: Calling groups_plugins_play to load vars for managed-node2 8975 1727204058.28447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204058.30712: done with get_vars() 8975 1727204058.30761: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.089) 0:00:29.625 ***** 8975 1727204058.30889: entering _queue_task() for managed-node2/service_facts 8975 1727204058.31345: worker is 1 (out of 1 available) 8975 1727204058.31360: exiting _queue_task() for managed-node2/service_facts 8975 1727204058.31376: done queuing things up, now waiting for results queue to drain 8975 1727204058.31377: waiting for pending results... 
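
The task being queued here, "Check which services are running" (set_facts.yml:21), uses the service_facts action that the worker is about to execute for managed-node2. service_facts populates ansible_facts.services with a state entry per unit, which the role can then inspect. A minimal sketch follows; the follow-up debug task and the NetworkManager example are illustrations, not part of this run.

- name: Check which services are running
  service_facts:                        # fills ansible_facts.services

- name: Example of inspecting the gathered services (illustration only)
  debug:
    msg: "{{ ansible_facts.services['NetworkManager.service'].state | default('not installed') }}"
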
8975 1727204058.31732: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 8975 1727204058.31866: in run() - task 127b8e07-fff9-9356-306d-000000000499 8975 1727204058.31884: variable 'ansible_search_path' from source: unknown 8975 1727204058.31887: variable 'ansible_search_path' from source: unknown 8975 1727204058.31918: calling self._execute() 8975 1727204058.32009: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204058.32016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204058.32026: variable 'omit' from source: magic vars 8975 1727204058.32341: variable 'ansible_distribution_major_version' from source: facts 8975 1727204058.32352: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204058.32359: variable 'omit' from source: magic vars 8975 1727204058.32428: variable 'omit' from source: magic vars 8975 1727204058.32456: variable 'omit' from source: magic vars 8975 1727204058.32497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204058.32528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204058.32551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204058.32568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204058.32579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204058.32605: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204058.32610: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204058.32613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204058.32695: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204058.32698: Set connection var ansible_connection to ssh 8975 1727204058.32701: Set connection var ansible_shell_executable to /bin/sh 8975 1727204058.32707: Set connection var ansible_timeout to 10 8975 1727204058.32710: Set connection var ansible_shell_type to sh 8975 1727204058.32720: Set connection var ansible_pipelining to False 8975 1727204058.32744: variable 'ansible_shell_executable' from source: unknown 8975 1727204058.32749: variable 'ansible_connection' from source: unknown 8975 1727204058.32752: variable 'ansible_module_compression' from source: unknown 8975 1727204058.32754: variable 'ansible_shell_type' from source: unknown 8975 1727204058.32757: variable 'ansible_shell_executable' from source: unknown 8975 1727204058.32760: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204058.32762: variable 'ansible_pipelining' from source: unknown 8975 1727204058.32764: variable 'ansible_timeout' from source: unknown 8975 1727204058.32772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204058.32940: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204058.32952: variable 'omit' from source: magic vars 8975 1727204058.32956: starting attempt loop 
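With the connection variables resolved (ssh connection, /bin/sh shell, 10 second timeout, pipelining off), the 'normal' action plugin is about to push the service_facts module to the host. A hedged sketch of how such a gather-and-consume pair is commonly written, assuming ansible_facts.services is a dict keyed by unit name with state/status/source fields as shown in the module output below; the second, consumer task is a hypothetical example and not part of the role:

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: React to a running unit (hypothetical example)
      ansible.builtin.debug:
        msg: "NetworkManager is active on this host"
      when:
        - "'NetworkManager.service' in ansible_facts.services"
        - "ansible_facts.services['NetworkManager.service'].state == 'running'"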
8975 1727204058.32958: running the handler 8975 1727204058.32974: _low_level_execute_command(): starting 8975 1727204058.32987: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204058.33551: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204058.33559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204058.33563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.33614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204058.33638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204058.33736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204058.35504: stdout chunk (state=3): >>>/root <<< 8975 1727204058.35615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204058.35688: stderr chunk (state=3): >>><<< 8975 1727204058.35691: stdout chunk (state=3): >>><<< 8975 1727204058.35712: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204058.35774: _low_level_execute_command(): starting 8975 1727204058.35779: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799 `" && echo ansible-tmp-1727204058.357178-11470-62603372993799="` echo 
/root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799 `" ) && sleep 0' 8975 1727204058.36246: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204058.36249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.36252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204058.36263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204058.36268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.36307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204058.36310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204058.36317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204058.36389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204058.38374: stdout chunk (state=3): >>>ansible-tmp-1727204058.357178-11470-62603372993799=/root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799 <<< 8975 1727204058.38485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204058.38548: stderr chunk (state=3): >>><<< 8975 1727204058.38552: stdout chunk (state=3): >>><<< 8975 1727204058.38574: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204058.357178-11470-62603372993799=/root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204058.38617: variable 'ansible_module_compression' from source: unknown 8975 1727204058.38657: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 8975 1727204058.38694: variable 'ansible_facts' from source: unknown 8975 1727204058.38754: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/AnsiballZ_service_facts.py 8975 1727204058.38870: Sending initial data 8975 1727204058.38883: Sent initial data (159 bytes) 8975 1727204058.39385: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204058.39389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.39392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204058.39394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.39457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204058.39467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204058.39470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204058.39541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204058.41139: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204058.41204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204058.41274: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpvioshms5 /root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/AnsiballZ_service_facts.py <<< 8975 1727204058.41280: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/AnsiballZ_service_facts.py" <<< 8975 1727204058.41344: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpvioshms5" to remote "/root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/AnsiballZ_service_facts.py" <<< 8975 1727204058.41347: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/AnsiballZ_service_facts.py" <<< 8975 1727204058.42054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204058.42136: stderr chunk (state=3): >>><<< 8975 1727204058.42140: stdout chunk (state=3): >>><<< 8975 1727204058.42157: done transferring module to remote 8975 1727204058.42170: _low_level_execute_command(): starting 8975 1727204058.42177: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/ /root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/AnsiballZ_service_facts.py && sleep 0' 8975 1727204058.42681: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204058.42685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.42687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204058.42690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204058.42696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.42744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204058.42749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204058.42821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204058.44635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204058.44693: stderr chunk (state=3): >>><<< 8975 1727204058.44697: stdout chunk (state=3): >>><<< 8975 1727204058.44712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204058.44715: _low_level_execute_command(): starting 8975 1727204058.44720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/AnsiballZ_service_facts.py && sleep 0' 8975 1727204058.45213: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204058.45217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.45220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204058.45224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204058.45226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204058.45279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204058.45283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204058.45362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204060.64045: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": 
"rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"<<< 8975 1727204060.64086: stdout chunk (state=3): >>>}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", 
"status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": 
"disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": 
"user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 8975 1727204060.65731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204060.65736: stdout chunk (state=3): >>><<< 8975 1727204060.65739: stderr chunk (state=3): >>><<< 8975 1727204060.65874: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": 
"plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": 
"systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": 
{"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
8975 1727204060.67190: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204060.67211: _low_level_execute_command(): starting 8975 1727204060.67222: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204058.357178-11470-62603372993799/ > /dev/null 2>&1 && sleep 0' 8975 1727204060.67938: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204060.68014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204060.68032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204060.68103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204060.68142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204060.68148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204060.68258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204060.70373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204060.70377: stderr chunk (state=3): >>><<< 8975 1727204060.70380: stdout chunk (state=3): >>><<< 8975 1727204060.70382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204060.70385: handler run complete 8975 1727204060.70548: variable 'ansible_facts' from source: unknown 8975 1727204060.70762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204060.71414: variable 'ansible_facts' from source: unknown 8975 1727204060.71588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204060.71864: attempt loop complete, returning result 8975 1727204060.71870: _execute() done 8975 1727204060.71873: dumping result to json 8975 1727204060.71950: done dumping result, returning 8975 1727204060.71961: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-9356-306d-000000000499] 8975 1727204060.71969: sending task result for task 127b8e07-fff9-9356-306d-000000000499 8975 1727204060.73730: done sending task result for task 127b8e07-fff9-9356-306d-000000000499 8975 1727204060.73734: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204060.73868: no more pending results, returning what we have 8975 1727204060.73872: results queue empty 8975 1727204060.73873: checking for any_errors_fatal 8975 1727204060.73878: done checking for any_errors_fatal 8975 1727204060.73879: checking for max_fail_percentage 8975 1727204060.73881: done checking for max_fail_percentage 8975 1727204060.73882: checking to see if all hosts have failed and the running result is not ok 8975 1727204060.73883: done checking to see if all hosts have failed 8975 1727204060.73884: getting the remaining hosts for this loop 8975 1727204060.73885: done getting the remaining hosts for this loop 8975 1727204060.73900: getting the next task for host managed-node2 8975 1727204060.73908: done getting next task for host managed-node2 8975 1727204060.73918: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 8975 1727204060.73925: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204060.73944: getting variables 8975 1727204060.73946: in VariableManager get_vars() 8975 1727204060.74007: Calling all_inventory to load vars for managed-node2 8975 1727204060.74012: Calling groups_inventory to load vars for managed-node2 8975 1727204060.74014: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204060.74025: Calling all_plugins_play to load vars for managed-node2 8975 1727204060.74028: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204060.74032: Calling groups_plugins_play to load vars for managed-node2 8975 1727204060.75956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204060.78303: done with get_vars() 8975 1727204060.78334: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:20 -0400 (0:00:02.475) 0:00:32.101 ***** 8975 1727204060.78453: entering _queue_task() for managed-node2/package_facts 8975 1727204060.78840: worker is 1 (out of 1 available) 8975 1727204060.78854: exiting _queue_task() for managed-node2/package_facts 8975 1727204060.79073: done queuing things up, now waiting for results queue to drain 8975 1727204060.79075: waiting for pending results... 8975 1727204060.79285: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 8975 1727204060.79393: in run() - task 127b8e07-fff9-9356-306d-00000000049a 8975 1727204060.79420: variable 'ansible_search_path' from source: unknown 8975 1727204060.79429: variable 'ansible_search_path' from source: unknown 8975 1727204060.79474: calling self._execute() 8975 1727204060.79681: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204060.79696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204060.79721: variable 'omit' from source: magic vars 8975 1727204060.80262: variable 'ansible_distribution_major_version' from source: facts 8975 1727204060.80288: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204060.80471: variable 'omit' from source: magic vars 8975 1727204060.80475: variable 'omit' from source: magic vars 8975 1727204060.80478: variable 'omit' from source: magic vars 8975 1727204060.80516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204060.80562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204060.80600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204060.80626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204060.80645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204060.80687: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204060.80703: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204060.80714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 
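The task queued next is the role's "Check which packages are installed" step from tasks/set_facts.yml:26, gated on the ansible_distribution_major_version != '6' conditional that the log shows evaluating to True. Below is a minimal sketch of such a task under those assumptions; it is not the collection's actual file, and the follow-up debug consumer is hypothetical, though the glibc entry it reads does appear in the package_facts output captured later in this log.

    # Sketch of a package-facts check equivalent to the logged task (illustrative)
    - name: Check which packages are installed
      ansible.builtin.package_facts:
      when: ansible_distribution_major_version != '6'

    # Hypothetical consumer of the gathered ansible_facts.packages map
    - name: Report installed glibc version
      ansible.builtin.debug:
        msg: "glibc {{ ansible_facts.packages['glibc'][0].version }} is installed"
      when: "'glibc' in ansible_facts.packages"
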
8975 1727204060.81164: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204060.81171: Set connection var ansible_connection to ssh 8975 1727204060.81175: Set connection var ansible_shell_executable to /bin/sh 8975 1727204060.81178: Set connection var ansible_timeout to 10 8975 1727204060.81181: Set connection var ansible_shell_type to sh 8975 1727204060.81183: Set connection var ansible_pipelining to False 8975 1727204060.81185: variable 'ansible_shell_executable' from source: unknown 8975 1727204060.81187: variable 'ansible_connection' from source: unknown 8975 1727204060.81190: variable 'ansible_module_compression' from source: unknown 8975 1727204060.81192: variable 'ansible_shell_type' from source: unknown 8975 1727204060.81194: variable 'ansible_shell_executable' from source: unknown 8975 1727204060.81196: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204060.81199: variable 'ansible_pipelining' from source: unknown 8975 1727204060.81201: variable 'ansible_timeout' from source: unknown 8975 1727204060.81203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204060.81373: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204060.81397: variable 'omit' from source: magic vars 8975 1727204060.81406: starting attempt loop 8975 1727204060.81413: running the handler 8975 1727204060.81431: _low_level_execute_command(): starting 8975 1727204060.81443: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204060.82231: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204060.82265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204060.82379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204060.82396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204060.82419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204060.82525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204060.84211: stdout chunk (state=3): >>>/root <<< 8975 1727204060.84393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204060.84414: stdout chunk (state=3): >>><<< 8975 1727204060.84428: stderr chunk (state=3): >>><<< 8975 1727204060.84571: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204060.84575: _low_level_execute_command(): starting 8975 1727204060.84579: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088 `" && echo ansible-tmp-1727204060.8445811-11558-168093221892088="` echo /root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088 `" ) && sleep 0' 8975 1727204060.85137: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204060.85256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204060.85290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204060.85398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204060.87387: stdout chunk (state=3): >>>ansible-tmp-1727204060.8445811-11558-168093221892088=/root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088 <<< 8975 1727204060.87609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204060.87615: stdout chunk (state=3): >>><<< 8975 1727204060.87618: stderr chunk (state=3): >>><<< 8975 1727204060.87772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204060.8445811-11558-168093221892088=/root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088 , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204060.87778: variable 'ansible_module_compression' from source: unknown 8975 1727204060.87781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 8975 1727204060.87846: variable 'ansible_facts' from source: unknown 8975 1727204060.88030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/AnsiballZ_package_facts.py 8975 1727204060.88256: Sending initial data 8975 1727204060.88259: Sent initial data (161 bytes) 8975 1727204060.88975: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204060.89037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204060.89040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204060.89045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204060.89120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204060.90798: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8975 1727204060.90803: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204060.90876: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204060.90958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpbmvre4q1 /root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/AnsiballZ_package_facts.py <<< 8975 1727204060.90961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/AnsiballZ_package_facts.py" <<< 8975 1727204060.91040: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpbmvre4q1" to remote "/root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/AnsiballZ_package_facts.py" <<< 8975 1727204060.92878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204060.92883: stdout chunk (state=3): >>><<< 8975 1727204060.92885: stderr chunk (state=3): >>><<< 8975 1727204060.92887: done transferring module to remote 8975 1727204060.92890: _low_level_execute_command(): starting 8975 1727204060.92892: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/ /root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/AnsiballZ_package_facts.py && sleep 0' 8975 1727204060.93573: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204060.93652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204060.93710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204060.93733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204060.93768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204060.93870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204060.95911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204060.95915: stderr chunk 
(state=3): >>><<< 8975 1727204060.95918: stdout chunk (state=3): >>><<< 8975 1727204060.95921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204060.95932: _low_level_execute_command(): starting 8975 1727204060.95935: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/AnsiballZ_package_facts.py && sleep 0' 8975 1727204060.96601: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204060.96608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204060.96617: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204060.96664: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204060.96696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204060.96712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204060.96723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204060.96803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204061.59450: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", 
"release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", 
"release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": 
"sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 8975 1727204061.59473: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": 
"3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 8975 1727204061.59495: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 8975 1727204061.59521: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 8975 1727204061.59539: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": 
[{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-l<<< 8975 1727204061.59564: stdout chunk (state=3): >>>ibs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": 
"2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 8975 1727204061.59578: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1"<<< 8975 1727204061.59623: stdout chunk (state=3): >>>, "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", 
"release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": <<< 8975 1727204061.59632: stdout chunk (state=3): >>>"x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": 
[{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", 
"release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 8975 1727204061.59659: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 8975 1727204061.59690: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50<<< 8975 1727204061.59724: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 8975 1727204061.59740: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", 
"release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 8975 1727204061.61576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204061.61637: stderr chunk (state=3): >>><<< 8975 1727204061.61641: stdout chunk (state=3): >>><<< 8975 1727204061.61692: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204061.65041: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204061.65353: _low_level_execute_command(): starting 8975 1727204061.65357: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204060.8445811-11558-168093221892088/ > /dev/null 2>&1 && sleep 0' 8975 1727204061.66089: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204061.66096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204061.66107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204061.66123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204061.66136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204061.66144: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204061.66154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204061.66171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204061.66253: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204061.66274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204061.66377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204061.68425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204061.68431: stdout chunk (state=3): >>><<< 8975 1727204061.68434: stderr chunk (state=3): >>><<< 8975 
1727204061.68576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204061.68580: handler run complete 8975 1727204061.69781: variable 'ansible_facts' from source: unknown 8975 1727204061.70491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204061.73408: variable 'ansible_facts' from source: unknown 8975 1727204061.74146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204061.75148: attempt loop complete, returning result 8975 1727204061.75171: _execute() done 8975 1727204061.75598: dumping result to json 8975 1727204061.76097: done dumping result, returning 8975 1727204061.76103: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-9356-306d-00000000049a] 8975 1727204061.76111: sending task result for task 127b8e07-fff9-9356-306d-00000000049a 8975 1727204061.78516: done sending task result for task 127b8e07-fff9-9356-306d-00000000049a 8975 1727204061.78520: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204061.78617: no more pending results, returning what we have 8975 1727204061.78620: results queue empty 8975 1727204061.78620: checking for any_errors_fatal 8975 1727204061.78624: done checking for any_errors_fatal 8975 1727204061.78625: checking for max_fail_percentage 8975 1727204061.78626: done checking for max_fail_percentage 8975 1727204061.78627: checking to see if all hosts have failed and the running result is not ok 8975 1727204061.78629: done checking to see if all hosts have failed 8975 1727204061.78630: getting the remaining hosts for this loop 8975 1727204061.78632: done getting the remaining hosts for this loop 8975 1727204061.78636: getting the next task for host managed-node2 8975 1727204061.78642: done getting next task for host managed-node2 8975 1727204061.78645: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 8975 1727204061.78648: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204061.78656: getting variables 8975 1727204061.78657: in VariableManager get_vars() 8975 1727204061.78686: Calling all_inventory to load vars for managed-node2 8975 1727204061.78688: Calling groups_inventory to load vars for managed-node2 8975 1727204061.78689: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204061.78696: Calling all_plugins_play to load vars for managed-node2 8975 1727204061.78698: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204061.78700: Calling groups_plugins_play to load vars for managed-node2 8975 1727204061.81393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204061.84748: done with get_vars() 8975 1727204061.84880: done getting variables 8975 1727204061.85037: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:21 -0400 (0:00:01.066) 0:00:33.167 ***** 8975 1727204061.85086: entering _queue_task() for managed-node2/debug 8975 1727204061.85984: worker is 1 (out of 1 available) 8975 1727204061.86000: exiting _queue_task() for managed-node2/debug 8975 1727204061.86016: done queuing things up, now waiting for results queue to drain 8975 1727204061.86018: waiting for pending results... 
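
The "Check which packages are installed" result above is printed only as "censored" because the task ran with no_log enabled ('_ansible_no_log': True in the module arguments), so Ansible suppresses the large package_facts payload in the play output while still recording the data as facts for later tasks. A minimal sketch of such a task, using the module arguments visible in the invocation above ("manager": ["auto"], "strategy": "first"); the exact wording and placement inside the role's task files are assumptions:

  - name: Check which packages are installed
    ansible.builtin.package_facts:
      manager: auto        # reported as ["auto"] in the invocation above
      strategy: first      # reported in the invocation above
    no_log: true           # reason the result appears as "censored" in this log
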
8975 1727204061.86592: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 8975 1727204061.86816: in run() - task 127b8e07-fff9-9356-306d-00000000007e 8975 1727204061.86840: variable 'ansible_search_path' from source: unknown 8975 1727204061.86845: variable 'ansible_search_path' from source: unknown 8975 1727204061.86915: calling self._execute() 8975 1727204061.87285: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204061.87352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204061.87356: variable 'omit' from source: magic vars 8975 1727204061.88801: variable 'ansible_distribution_major_version' from source: facts 8975 1727204061.88806: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204061.88810: variable 'omit' from source: magic vars 8975 1727204061.88812: variable 'omit' from source: magic vars 8975 1727204061.88904: variable 'network_provider' from source: set_fact 8975 1727204061.88932: variable 'omit' from source: magic vars 8975 1727204061.88986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204061.89032: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204061.89058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204061.89085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204061.89102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204061.89140: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204061.89149: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204061.89157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204061.89273: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204061.89282: Set connection var ansible_connection to ssh 8975 1727204061.89293: Set connection var ansible_shell_executable to /bin/sh 8975 1727204061.89303: Set connection var ansible_timeout to 10 8975 1727204061.89309: Set connection var ansible_shell_type to sh 8975 1727204061.89326: Set connection var ansible_pipelining to False 8975 1727204061.89355: variable 'ansible_shell_executable' from source: unknown 8975 1727204061.89382: variable 'ansible_connection' from source: unknown 8975 1727204061.89385: variable 'ansible_module_compression' from source: unknown 8975 1727204061.89387: variable 'ansible_shell_type' from source: unknown 8975 1727204061.89389: variable 'ansible_shell_executable' from source: unknown 8975 1727204061.89391: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204061.89471: variable 'ansible_pipelining' from source: unknown 8975 1727204061.89474: variable 'ansible_timeout' from source: unknown 8975 1727204061.89477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204061.89569: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 
1727204061.89588: variable 'omit' from source: magic vars 8975 1727204061.89602: starting attempt loop 8975 1727204061.89609: running the handler 8975 1727204061.89662: handler run complete 8975 1727204061.89685: attempt loop complete, returning result 8975 1727204061.89692: _execute() done 8975 1727204061.89699: dumping result to json 8975 1727204061.89710: done dumping result, returning 8975 1727204061.89721: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-9356-306d-00000000007e] 8975 1727204061.89733: sending task result for task 127b8e07-fff9-9356-306d-00000000007e 8975 1727204061.90006: done sending task result for task 127b8e07-fff9-9356-306d-00000000007e 8975 1727204061.90009: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 8975 1727204061.90084: no more pending results, returning what we have 8975 1727204061.90088: results queue empty 8975 1727204061.90089: checking for any_errors_fatal 8975 1727204061.90097: done checking for any_errors_fatal 8975 1727204061.90098: checking for max_fail_percentage 8975 1727204061.90100: done checking for max_fail_percentage 8975 1727204061.90101: checking to see if all hosts have failed and the running result is not ok 8975 1727204061.90102: done checking to see if all hosts have failed 8975 1727204061.90103: getting the remaining hosts for this loop 8975 1727204061.90105: done getting the remaining hosts for this loop 8975 1727204061.90109: getting the next task for host managed-node2 8975 1727204061.90119: done getting next task for host managed-node2 8975 1727204061.90123: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8975 1727204061.90128: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204061.90141: getting variables 8975 1727204061.90142: in VariableManager get_vars() 8975 1727204061.90191: Calling all_inventory to load vars for managed-node2 8975 1727204061.90194: Calling groups_inventory to load vars for managed-node2 8975 1727204061.90196: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204061.90207: Calling all_plugins_play to load vars for managed-node2 8975 1727204061.90211: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204061.90214: Calling groups_plugins_play to load vars for managed-node2 8975 1727204061.92111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204061.94267: done with get_vars() 8975 1727204061.94306: done getting variables 8975 1727204061.94375: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.093) 0:00:33.261 ***** 8975 1727204061.94417: entering _queue_task() for managed-node2/fail 8975 1727204061.94802: worker is 1 (out of 1 available) 8975 1727204061.94818: exiting _queue_task() for managed-node2/fail 8975 1727204061.94832: done queuing things up, now waiting for results queue to drain 8975 1727204061.94834: waiting for pending results... 
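
The "Print network provider" task that just completed (task path roles/network/tasks/main.yml:7) is a plain debug action: the handler ran without contacting the remote host and produced "Using network provider: nm" from the network_provider variable that an earlier set_fact supplied. A minimal sketch of what such a task can look like; the message template is an assumption inferred from the printed output:

  - name: Print network provider
    ansible.builtin.debug:
      msg: "Using network provider: {{ network_provider }}"   # renders as "Using network provider: nm" on this host
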
8975 1727204061.95142: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 8975 1727204061.95473: in run() - task 127b8e07-fff9-9356-306d-00000000007f 8975 1727204061.95477: variable 'ansible_search_path' from source: unknown 8975 1727204061.95480: variable 'ansible_search_path' from source: unknown 8975 1727204061.95484: calling self._execute() 8975 1727204061.95496: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204061.95508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204061.95521: variable 'omit' from source: magic vars 8975 1727204061.95935: variable 'ansible_distribution_major_version' from source: facts 8975 1727204061.95952: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204061.96106: variable 'network_state' from source: role '' defaults 8975 1727204061.96124: Evaluated conditional (network_state != {}): False 8975 1727204061.96132: when evaluation is False, skipping this task 8975 1727204061.96146: _execute() done 8975 1727204061.96154: dumping result to json 8975 1727204061.96162: done dumping result, returning 8975 1727204061.96177: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-9356-306d-00000000007f] 8975 1727204061.96190: sending task result for task 127b8e07-fff9-9356-306d-00000000007f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204061.96353: no more pending results, returning what we have 8975 1727204061.96357: results queue empty 8975 1727204061.96358: checking for any_errors_fatal 8975 1727204061.96368: done checking for any_errors_fatal 8975 1727204061.96370: checking for max_fail_percentage 8975 1727204061.96372: done checking for max_fail_percentage 8975 1727204061.96373: checking to see if all hosts have failed and the running result is not ok 8975 1727204061.96374: done checking to see if all hosts have failed 8975 1727204061.96375: getting the remaining hosts for this loop 8975 1727204061.96377: done getting the remaining hosts for this loop 8975 1727204061.96382: getting the next task for host managed-node2 8975 1727204061.96392: done getting next task for host managed-node2 8975 1727204061.96396: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8975 1727204061.96401: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 8975 1727204061.96427: getting variables 8975 1727204061.96429: in VariableManager get_vars() 8975 1727204061.96780: Calling all_inventory to load vars for managed-node2 8975 1727204061.96784: Calling groups_inventory to load vars for managed-node2 8975 1727204061.96786: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204061.96796: Calling all_plugins_play to load vars for managed-node2 8975 1727204061.96799: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204061.96803: Calling groups_plugins_play to load vars for managed-node2 8975 1727204061.97486: done sending task result for task 127b8e07-fff9-9356-306d-00000000007f 8975 1727204061.97490: WORKER PROCESS EXITING 8975 1727204061.99338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204062.02623: done with get_vars() 8975 1727204062.02663: done getting variables 8975 1727204062.02735: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.083) 0:00:33.344 ***** 8975 1727204062.02776: entering _queue_task() for managed-node2/fail 8975 1727204062.03155: worker is 1 (out of 1 available) 8975 1727204062.03171: exiting _queue_task() for managed-node2/fail 8975 1727204062.03185: done queuing things up, now waiting for results queue to drain 8975 1727204062.03186: waiting for pending results... 
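
The "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" task above was skipped because its conditional network_state != {} evaluated to False (network_state still holds the role's default empty dict). A minimal sketch of this guard-and-fail pattern; the failure message and any additional conditions implied by the task name (such as a check on the selected provider) are assumptions:

  - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
    ansible.builtin.fail:
      msg: Applying the network state configuration is not supported with the initscripts provider  # assumed wording
    when:
      - network_state != {}   # the condition reported as false_condition in the skip result above
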
8975 1727204062.03501: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 8975 1727204062.03685: in run() - task 127b8e07-fff9-9356-306d-000000000080 8975 1727204062.03707: variable 'ansible_search_path' from source: unknown 8975 1727204062.03715: variable 'ansible_search_path' from source: unknown 8975 1727204062.03764: calling self._execute() 8975 1727204062.03883: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204062.03896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204062.03911: variable 'omit' from source: magic vars 8975 1727204062.04338: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.04357: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204062.04502: variable 'network_state' from source: role '' defaults 8975 1727204062.04519: Evaluated conditional (network_state != {}): False 8975 1727204062.04526: when evaluation is False, skipping this task 8975 1727204062.04533: _execute() done 8975 1727204062.04541: dumping result to json 8975 1727204062.04547: done dumping result, returning 8975 1727204062.04560: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-9356-306d-000000000080] 8975 1727204062.04574: sending task result for task 127b8e07-fff9-9356-306d-000000000080 8975 1727204062.04699: done sending task result for task 127b8e07-fff9-9356-306d-000000000080 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204062.04756: no more pending results, returning what we have 8975 1727204062.04761: results queue empty 8975 1727204062.04762: checking for any_errors_fatal 8975 1727204062.04774: done checking for any_errors_fatal 8975 1727204062.04776: checking for max_fail_percentage 8975 1727204062.04778: done checking for max_fail_percentage 8975 1727204062.04779: checking to see if all hosts have failed and the running result is not ok 8975 1727204062.04780: done checking to see if all hosts have failed 8975 1727204062.04781: getting the remaining hosts for this loop 8975 1727204062.04783: done getting the remaining hosts for this loop 8975 1727204062.04788: getting the next task for host managed-node2 8975 1727204062.04797: done getting next task for host managed-node2 8975 1727204062.04801: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8975 1727204062.04805: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204062.04829: getting variables 8975 1727204062.04831: in VariableManager get_vars() 8975 1727204062.04983: Calling all_inventory to load vars for managed-node2 8975 1727204062.04987: Calling groups_inventory to load vars for managed-node2 8975 1727204062.04989: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204062.05177: Calling all_plugins_play to load vars for managed-node2 8975 1727204062.05181: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204062.05184: Calling groups_plugins_play to load vars for managed-node2 8975 1727204062.05884: WORKER PROCESS EXITING 8975 1727204062.07039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204062.09195: done with get_vars() 8975 1727204062.09236: done getting variables 8975 1727204062.09304: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.065) 0:00:33.410 ***** 8975 1727204062.09343: entering _queue_task() for managed-node2/fail 8975 1727204062.09725: worker is 1 (out of 1 available) 8975 1727204062.09739: exiting _queue_task() for managed-node2/fail 8975 1727204062.09754: done queuing things up, now waiting for results queue to drain 8975 1727204062.09756: waiting for pending results... 
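
Next up is the EL10 teaming guard (main.yml:25). Two of its conditions are evaluated in order below: ansible_distribution_major_version | int > 9 is true on managed-node2, but ansible_distribution in __network_rh_distros is false, so the abort is skipped before any teaming check matters. A sketch of that guard using only the conditions visible in the log; the failure message is an assumption:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later   # wording assumed
      when:
        - ansible_distribution_major_version | int > 9      # evaluated True below
        - ansible_distribution in __network_rh_distros      # evaluated False below, so the task is skipped
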
8975 1727204062.10082: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 8975 1727204062.10260: in run() - task 127b8e07-fff9-9356-306d-000000000081 8975 1727204062.10286: variable 'ansible_search_path' from source: unknown 8975 1727204062.10295: variable 'ansible_search_path' from source: unknown 8975 1727204062.10344: calling self._execute() 8975 1727204062.10459: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204062.10475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204062.10490: variable 'omit' from source: magic vars 8975 1727204062.10913: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.10932: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204062.11135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204062.15609: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204062.15698: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204062.15745: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204062.15789: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204062.15828: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204062.15928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.15967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.15999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.16053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.16076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.16188: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.16211: Evaluated conditional (ansible_distribution_major_version | int > 9): True 8975 1727204062.16350: variable 'ansible_distribution' from source: facts 8975 1727204062.16360: variable '__network_rh_distros' from source: role '' defaults 8975 1727204062.16377: Evaluated conditional (ansible_distribution in __network_rh_distros): False 8975 1727204062.16385: when evaluation is False, skipping this task 8975 1727204062.16391: _execute() done 8975 1727204062.16399: dumping result to json 8975 1727204062.16406: done dumping result, returning 8975 1727204062.16452: done running TaskExecutor() for managed-node2/TASK: 
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-9356-306d-000000000081] 8975 1727204062.16456: sending task result for task 127b8e07-fff9-9356-306d-000000000081 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 8975 1727204062.16605: no more pending results, returning what we have 8975 1727204062.16610: results queue empty 8975 1727204062.16611: checking for any_errors_fatal 8975 1727204062.16620: done checking for any_errors_fatal 8975 1727204062.16621: checking for max_fail_percentage 8975 1727204062.16623: done checking for max_fail_percentage 8975 1727204062.16625: checking to see if all hosts have failed and the running result is not ok 8975 1727204062.16626: done checking to see if all hosts have failed 8975 1727204062.16627: getting the remaining hosts for this loop 8975 1727204062.16629: done getting the remaining hosts for this loop 8975 1727204062.16634: getting the next task for host managed-node2 8975 1727204062.16643: done getting next task for host managed-node2 8975 1727204062.16648: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 8975 1727204062.16652: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204062.16673: getting variables 8975 1727204062.16675: in VariableManager get_vars() 8975 1727204062.16723: Calling all_inventory to load vars for managed-node2 8975 1727204062.16726: Calling groups_inventory to load vars for managed-node2 8975 1727204062.16728: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204062.16741: Calling all_plugins_play to load vars for managed-node2 8975 1727204062.16745: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204062.16749: Calling groups_plugins_play to load vars for managed-node2 8975 1727204062.17385: done sending task result for task 127b8e07-fff9-9356-306d-000000000081 8975 1727204062.17390: WORKER PROCESS EXITING 8975 1727204062.18770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204062.20926: done with get_vars() 8975 1727204062.20967: done getting variables 8975 1727204062.21034: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.117) 0:00:33.527 ***** 8975 1727204062.21074: entering _queue_task() for managed-node2/dnf 8975 1727204062.21446: worker is 1 (out of 1 available) 8975 1727204062.21460: exiting _queue_task() for managed-node2/dnf 8975 1727204062.21475: done queuing things up, now waiting for results queue to drain 8975 1727204062.21477: waiting for pending results... 
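
The DNF check queued above (main.yml:36) is the first task gated on the wireless/team guard: the executor below resolves network_connections through port1_profile, port2_profile and controller_profile, finds neither a wireless nor a team connection, and skips. A rough sketch of a check-only dnf task with that guard; only the two when expressions are taken from the log, while the module arguments and check_mode usage are assumptions:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"    # assumed variable for the role's package list
        state: latest
      check_mode: true                    # only report whether updates exist, install nothing (assumed)
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # evaluated True below
        - __network_wireless_connections_defined or __network_team_connections_defined       # evaluated False below
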
8975 1727204062.21780: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 8975 1727204062.21960: in run() - task 127b8e07-fff9-9356-306d-000000000082 8975 1727204062.21983: variable 'ansible_search_path' from source: unknown 8975 1727204062.21991: variable 'ansible_search_path' from source: unknown 8975 1727204062.22040: calling self._execute() 8975 1727204062.22149: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204062.22161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204062.22178: variable 'omit' from source: magic vars 8975 1727204062.22585: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.22606: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204062.22873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204062.27066: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204062.27375: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204062.27380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204062.27382: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204062.27385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204062.27564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.27733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.27771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.27827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.27849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.27989: variable 'ansible_distribution' from source: facts 8975 1727204062.28000: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.28012: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 8975 1727204062.28149: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204062.28301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.28330: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.28370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.28419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.28439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.28493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.28522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.28551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.28601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.28618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.28668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.28700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.28729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.28776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.28801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.28985: variable 'network_connections' from source: task vars 8975 1727204062.29009: variable 'port2_profile' from source: play vars 8975 1727204062.29089: variable 'port2_profile' from source: play vars 8975 1727204062.29108: variable 'port1_profile' from source: play vars 8975 1727204062.29176: variable 'port1_profile' from source: play vars 8975 1727204062.29189: variable 'controller_profile' from source: play vars 8975 
1727204062.29326: variable 'controller_profile' from source: play vars 8975 1727204062.29349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204062.29576: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204062.29625: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204062.29668: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204062.29703: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204062.29758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204062.29800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204062.29832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.29864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204062.29970: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204062.30207: variable 'network_connections' from source: task vars 8975 1727204062.30219: variable 'port2_profile' from source: play vars 8975 1727204062.30290: variable 'port2_profile' from source: play vars 8975 1727204062.30305: variable 'port1_profile' from source: play vars 8975 1727204062.30378: variable 'port1_profile' from source: play vars 8975 1727204062.30393: variable 'controller_profile' from source: play vars 8975 1727204062.30462: variable 'controller_profile' from source: play vars 8975 1727204062.30534: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8975 1727204062.30537: when evaluation is False, skipping this task 8975 1727204062.30540: _execute() done 8975 1727204062.30542: dumping result to json 8975 1727204062.30544: done dumping result, returning 8975 1727204062.30546: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-9356-306d-000000000082] 8975 1727204062.30549: sending task result for task 127b8e07-fff9-9356-306d-000000000082 8975 1727204062.30902: done sending task result for task 127b8e07-fff9-9356-306d-000000000082 8975 1727204062.30906: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8975 1727204062.30953: no more pending results, returning what we have 8975 1727204062.30956: results queue empty 8975 1727204062.30957: checking for any_errors_fatal 8975 1727204062.30962: done checking for any_errors_fatal 8975 1727204062.30963: checking for max_fail_percentage 8975 1727204062.30965: done checking for max_fail_percentage 8975 
1727204062.30967: checking to see if all hosts have failed and the running result is not ok 8975 1727204062.30968: done checking to see if all hosts have failed 8975 1727204062.30969: getting the remaining hosts for this loop 8975 1727204062.30971: done getting the remaining hosts for this loop 8975 1727204062.30975: getting the next task for host managed-node2 8975 1727204062.30982: done getting next task for host managed-node2 8975 1727204062.30986: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 8975 1727204062.30990: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204062.31007: getting variables 8975 1727204062.31009: in VariableManager get_vars() 8975 1727204062.31054: Calling all_inventory to load vars for managed-node2 8975 1727204062.31058: Calling groups_inventory to load vars for managed-node2 8975 1727204062.31060: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204062.31073: Calling all_plugins_play to load vars for managed-node2 8975 1727204062.31077: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204062.31080: Calling groups_plugins_play to load vars for managed-node2 8975 1727204062.33019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204062.35139: done with get_vars() 8975 1727204062.35178: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 8975 1727204062.35263: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.142) 0:00:33.669 ***** 8975 1727204062.35304: entering _queue_task() for managed-node2/yum 8975 1727204062.35787: worker is 1 (out of 1 available) 8975 1727204062.35800: exiting _queue_task() for managed-node2/yum 8975 1727204062.35813: done queuing things up, now waiting for results queue to drain 8975 1727204062.35814: waiting for pending results... 
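
The YUM variant of the same check (main.yml:48) is queued above; note the preceding "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry, since yum is only an alias for dnf on this control node. The task is fenced off to EL7-era hosts, so ansible_distribution_major_version | int < 8 evaluates to False below and it is skipped without ever reaching the wireless/team guard. A minimal sketch of that fence (module arguments again assumed):

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:                # redirected to ansible.builtin.dnf, as logged above
        name: "{{ network_packages }}"    # assumed variable
        state: latest
      check_mode: true                    # assumed
      when:
        - ansible_distribution_major_version | int < 8    # evaluated False below, so the task is skipped
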
8975 1727204062.36035: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 8975 1727204062.36211: in run() - task 127b8e07-fff9-9356-306d-000000000083 8975 1727204062.36235: variable 'ansible_search_path' from source: unknown 8975 1727204062.36244: variable 'ansible_search_path' from source: unknown 8975 1727204062.36294: calling self._execute() 8975 1727204062.36413: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204062.36427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204062.36572: variable 'omit' from source: magic vars 8975 1727204062.36859: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.36880: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204062.37092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204062.41381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204062.41468: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204062.41518: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204062.41563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204062.41596: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204062.41696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.41736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.41770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.41819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.41952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.41955: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.41973: Evaluated conditional (ansible_distribution_major_version | int < 8): False 8975 1727204062.41981: when evaluation is False, skipping this task 8975 1727204062.41988: _execute() done 8975 1727204062.41996: dumping result to json 8975 1727204062.42003: done dumping result, returning 8975 1727204062.42017: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-9356-306d-000000000083] 8975 1727204062.42029: sending task result for 
task 127b8e07-fff9-9356-306d-000000000083 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 8975 1727204062.42219: no more pending results, returning what we have 8975 1727204062.42223: results queue empty 8975 1727204062.42224: checking for any_errors_fatal 8975 1727204062.42231: done checking for any_errors_fatal 8975 1727204062.42232: checking for max_fail_percentage 8975 1727204062.42234: done checking for max_fail_percentage 8975 1727204062.42235: checking to see if all hosts have failed and the running result is not ok 8975 1727204062.42237: done checking to see if all hosts have failed 8975 1727204062.42237: getting the remaining hosts for this loop 8975 1727204062.42240: done getting the remaining hosts for this loop 8975 1727204062.42244: getting the next task for host managed-node2 8975 1727204062.42254: done getting next task for host managed-node2 8975 1727204062.42258: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8975 1727204062.42261: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204062.42287: getting variables 8975 1727204062.42289: in VariableManager get_vars() 8975 1727204062.42334: Calling all_inventory to load vars for managed-node2 8975 1727204062.42338: Calling groups_inventory to load vars for managed-node2 8975 1727204062.42340: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204062.42352: Calling all_plugins_play to load vars for managed-node2 8975 1727204062.42355: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204062.42359: Calling groups_plugins_play to load vars for managed-node2 8975 1727204062.42475: done sending task result for task 127b8e07-fff9-9356-306d-000000000083 8975 1727204062.42478: WORKER PROCESS EXITING 8975 1727204062.44590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204062.53194: done with get_vars() 8975 1727204062.53236: done getting variables 8975 1727204062.53297: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.180) 0:00:33.850 ***** 8975 1727204062.53334: entering _queue_task() for managed-node2/fail 8975 1727204062.53816: worker is 1 (out of 1 available) 8975 1727204062.53829: exiting _queue_task() for managed-node2/fail 8975 1727204062.53841: done queuing things up, now waiting for results queue to drain 8975 1727204062.53843: waiting for pending results... 
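
The consent prompt queued above (main.yml:60) reuses the wireless/team guard: since neither connection type is defined, the fail below is skipped and no acknowledgement is required. A sketch of the pattern, where network_restart_acknowledged is a hypothetical variable standing in for whatever acknowledgement the role actually expects:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: Wireless or team interfaces require restarting NetworkManager; set the acknowledgement variable to continue   # wording assumed
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined   # evaluated False below
        - not network_restart_acknowledged | default(false)                              # hypothetical variable name
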
8975 1727204062.54083: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 8975 1727204062.54284: in run() - task 127b8e07-fff9-9356-306d-000000000084 8975 1727204062.54312: variable 'ansible_search_path' from source: unknown 8975 1727204062.54319: variable 'ansible_search_path' from source: unknown 8975 1727204062.54362: calling self._execute() 8975 1727204062.54484: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204062.54498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204062.54518: variable 'omit' from source: magic vars 8975 1727204062.55072: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.55076: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204062.55573: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204062.55723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204062.60140: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204062.60236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204062.60292: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204062.60342: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204062.60386: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204062.60496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.60539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.60579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.60639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.60661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.60727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.60760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.60805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 8975 1727204062.60851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.60913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.60935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.60968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.60999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.61054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.61077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.61346: variable 'network_connections' from source: task vars 8975 1727204062.61350: variable 'port2_profile' from source: play vars 8975 1727204062.61396: variable 'port2_profile' from source: play vars 8975 1727204062.61413: variable 'port1_profile' from source: play vars 8975 1727204062.61491: variable 'port1_profile' from source: play vars 8975 1727204062.61505: variable 'controller_profile' from source: play vars 8975 1727204062.61579: variable 'controller_profile' from source: play vars 8975 1727204062.61672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204062.62050: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204062.62108: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204062.62148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204062.62186: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204062.62247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204062.62543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204062.62547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.62549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204062.62552: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204062.63199: variable 'network_connections' from source: task vars 8975 1727204062.63348: variable 'port2_profile' from source: play vars 8975 1727204062.63424: variable 'port2_profile' from source: play vars 8975 1727204062.63455: variable 'port1_profile' from source: play vars 8975 1727204062.63621: variable 'port1_profile' from source: play vars 8975 1727204062.63658: variable 'controller_profile' from source: play vars 8975 1727204062.63841: variable 'controller_profile' from source: play vars 8975 1727204062.63844: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8975 1727204062.63856: when evaluation is False, skipping this task 8975 1727204062.63859: _execute() done 8975 1727204062.63931: dumping result to json 8975 1727204062.63944: done dumping result, returning 8975 1727204062.64006: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-9356-306d-000000000084] 8975 1727204062.64020: sending task result for task 127b8e07-fff9-9356-306d-000000000084 8975 1727204062.64232: done sending task result for task 127b8e07-fff9-9356-306d-000000000084 8975 1727204062.64235: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8975 1727204062.64314: no more pending results, returning what we have 8975 1727204062.64318: results queue empty 8975 1727204062.64319: checking for any_errors_fatal 8975 1727204062.64326: done checking for any_errors_fatal 8975 1727204062.64329: checking for max_fail_percentage 8975 1727204062.64331: done checking for max_fail_percentage 8975 1727204062.64332: checking to see if all hosts have failed and the running result is not ok 8975 1727204062.64340: done checking to see if all hosts have failed 8975 1727204062.64341: getting the remaining hosts for this loop 8975 1727204062.64344: done getting the remaining hosts for this loop 8975 1727204062.64349: getting the next task for host managed-node2 8975 1727204062.64359: done getting next task for host managed-node2 8975 1727204062.64364: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 8975 1727204062.64370: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204062.64392: getting variables 8975 1727204062.64394: in VariableManager get_vars() 8975 1727204062.64446: Calling all_inventory to load vars for managed-node2 8975 1727204062.64450: Calling groups_inventory to load vars for managed-node2 8975 1727204062.64452: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204062.64464: Calling all_plugins_play to load vars for managed-node2 8975 1727204062.64672: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204062.64677: Calling groups_plugins_play to load vars for managed-node2 8975 1727204062.66469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204062.68898: done with get_vars() 8975 1727204062.68937: done getting variables 8975 1727204062.69248: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.159) 0:00:34.009 ***** 8975 1727204062.69298: entering _queue_task() for managed-node2/package 8975 1727204062.69916: worker is 1 (out of 1 available) 8975 1727204062.69934: exiting _queue_task() for managed-node2/package 8975 1727204062.69950: done queuing things up, now waiting for results queue to drain 8975 1727204062.69952: waiting for pending results... 8975 1727204062.70429: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 8975 1727204062.70854: in run() - task 127b8e07-fff9-9356-306d-000000000085 8975 1727204062.70859: variable 'ansible_search_path' from source: unknown 8975 1727204062.70862: variable 'ansible_search_path' from source: unknown 8975 1727204062.70962: calling self._execute() 8975 1727204062.71013: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204062.71025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204062.71045: variable 'omit' from source: magic vars 8975 1727204062.72013: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.72039: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204062.72604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204062.73137: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204062.73354: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204062.73401: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204062.73498: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204062.73751: variable 'network_packages' from source: role '' defaults 8975 1727204062.73987: variable '__network_provider_setup' from source: role '' defaults 8975 1727204062.74006: variable '__network_service_name_default_nm' from source: role '' defaults 8975 1727204062.74098: variable '__network_service_name_default_nm' from source: role '' 
defaults 8975 1727204062.74113: variable '__network_packages_default_nm' from source: role '' defaults 8975 1727204062.74194: variable '__network_packages_default_nm' from source: role '' defaults 8975 1727204062.74432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204062.77282: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204062.77364: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204062.77415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204062.77456: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204062.77489: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204062.77591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.77633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.77668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.77716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.77774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.77798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.77826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.77859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.77910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.77930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.78195: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8975 1727204062.78336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
8975 1727204062.78388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.78535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.78538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.78540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.78598: variable 'ansible_python' from source: facts 8975 1727204062.78640: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8975 1727204062.78744: variable '__network_wpa_supplicant_required' from source: role '' defaults 8975 1727204062.78841: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8975 1727204062.78999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.79031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.79083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.79116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.79171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.79200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204062.79237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204062.79262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.79306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204062.79372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204062.79479: variable 
'network_connections' from source: task vars 8975 1727204062.79546: variable 'port2_profile' from source: play vars 8975 1727204062.79662: variable 'port2_profile' from source: play vars 8975 1727204062.79683: variable 'port1_profile' from source: play vars 8975 1727204062.79795: variable 'port1_profile' from source: play vars 8975 1727204062.79806: variable 'controller_profile' from source: play vars 8975 1727204062.79941: variable 'controller_profile' from source: play vars 8975 1727204062.80172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204062.80176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204062.80178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204062.80181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204062.80249: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204062.80588: variable 'network_connections' from source: task vars 8975 1727204062.80594: variable 'port2_profile' from source: play vars 8975 1727204062.80738: variable 'port2_profile' from source: play vars 8975 1727204062.80749: variable 'port1_profile' from source: play vars 8975 1727204062.81132: variable 'port1_profile' from source: play vars 8975 1727204062.81140: variable 'controller_profile' from source: play vars 8975 1727204062.81243: variable 'controller_profile' from source: play vars 8975 1727204062.81286: variable '__network_packages_default_wireless' from source: role '' defaults 8975 1727204062.81369: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204062.81736: variable 'network_connections' from source: task vars 8975 1727204062.81740: variable 'port2_profile' from source: play vars 8975 1727204062.81817: variable 'port2_profile' from source: play vars 8975 1727204062.81829: variable 'port1_profile' from source: play vars 8975 1727204062.81926: variable 'port1_profile' from source: play vars 8975 1727204062.81931: variable 'controller_profile' from source: play vars 8975 1727204062.81983: variable 'controller_profile' from source: play vars 8975 1727204062.82012: variable '__network_packages_default_team' from source: role '' defaults 8975 1727204062.82144: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204062.82480: variable 'network_connections' from source: task vars 8975 1727204062.82485: variable 'port2_profile' from source: play vars 8975 1727204062.82558: variable 'port2_profile' from source: play vars 8975 1727204062.82567: variable 'port1_profile' from source: play vars 8975 1727204062.82636: variable 'port1_profile' from source: play vars 8975 1727204062.82644: variable 'controller_profile' from source: play vars 8975 1727204062.82770: variable 'controller_profile' from source: play vars 8975 1727204062.82774: variable '__network_service_name_default_initscripts' from source: role '' defaults 8975 1727204062.82830: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 8975 1727204062.82834: variable '__network_packages_default_initscripts' from source: role '' defaults 8975 1727204062.82894: variable '__network_packages_default_initscripts' from source: role '' defaults 8975 1727204062.83114: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8975 1727204062.83619: variable 'network_connections' from source: task vars 8975 1727204062.83625: variable 'port2_profile' from source: play vars 8975 1727204062.83704: variable 'port2_profile' from source: play vars 8975 1727204062.83712: variable 'port1_profile' from source: play vars 8975 1727204062.83784: variable 'port1_profile' from source: play vars 8975 1727204062.83971: variable 'controller_profile' from source: play vars 8975 1727204062.83974: variable 'controller_profile' from source: play vars 8975 1727204062.83977: variable 'ansible_distribution' from source: facts 8975 1727204062.83979: variable '__network_rh_distros' from source: role '' defaults 8975 1727204062.83982: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.83984: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8975 1727204062.84045: variable 'ansible_distribution' from source: facts 8975 1727204062.84048: variable '__network_rh_distros' from source: role '' defaults 8975 1727204062.84054: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.84061: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8975 1727204062.84229: variable 'ansible_distribution' from source: facts 8975 1727204062.84233: variable '__network_rh_distros' from source: role '' defaults 8975 1727204062.84236: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.84274: variable 'network_provider' from source: set_fact 8975 1727204062.84289: variable 'ansible_facts' from source: unknown 8975 1727204062.85111: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 8975 1727204062.85116: when evaluation is False, skipping this task 8975 1727204062.85118: _execute() done 8975 1727204062.85121: dumping result to json 8975 1727204062.85123: done dumping result, returning 8975 1727204062.85132: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-9356-306d-000000000085] 8975 1727204062.85135: sending task result for task 127b8e07-fff9-9356-306d-000000000085 8975 1727204062.85472: done sending task result for task 127b8e07-fff9-9356-306d-000000000085 8975 1727204062.85476: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 8975 1727204062.85517: no more pending results, returning what we have 8975 1727204062.85520: results queue empty 8975 1727204062.85521: checking for any_errors_fatal 8975 1727204062.85529: done checking for any_errors_fatal 8975 1727204062.85530: checking for max_fail_percentage 8975 1727204062.85532: done checking for max_fail_percentage 8975 1727204062.85533: checking to see if all hosts have failed and the running result is not ok 8975 1727204062.85534: done checking to see if all hosts have failed 8975 1727204062.85534: getting the remaining hosts for this loop 8975 1727204062.85536: done getting the remaining hosts for 
this loop 8975 1727204062.85544: getting the next task for host managed-node2 8975 1727204062.85551: done getting next task for host managed-node2 8975 1727204062.85556: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8975 1727204062.85559: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204062.85579: getting variables 8975 1727204062.85580: in VariableManager get_vars() 8975 1727204062.85622: Calling all_inventory to load vars for managed-node2 8975 1727204062.85625: Calling groups_inventory to load vars for managed-node2 8975 1727204062.85629: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204062.85639: Calling all_plugins_play to load vars for managed-node2 8975 1727204062.85642: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204062.85645: Calling groups_plugins_play to load vars for managed-node2 8975 1727204062.87686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204062.89882: done with get_vars() 8975 1727204062.89917: done getting variables 8975 1727204062.89991: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.207) 0:00:34.217 ***** 8975 1727204062.90035: entering _queue_task() for managed-node2/package 8975 1727204062.90516: worker is 1 (out of 1 available) 8975 1727204062.90532: exiting _queue_task() for managed-node2/package 8975 1727204062.90544: done queuing things up, now waiting for results queue to drain 8975 1727204062.90545: waiting for pending results... 
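The skip recorded just above comes from the role's package-install guard: the task only runs when at least one entry of network_packages is missing from the package facts gathered earlier. A minimal sketch of that pattern, assuming the same conditional string the log shows (this is an illustration, not the role's verbatim task, and network_packages is whatever list the role has built by this point):

    # Sketch: install only when something in network_packages is not already
    # present in ansible_facts.packages (populated by package_facts).
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

Because every required package was already installed on managed-node2, the conditional evaluated to False and the worker returned a skip result without ever invoking the package module remotely.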
8975 1727204062.90785: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 8975 1727204062.90977: in run() - task 127b8e07-fff9-9356-306d-000000000086 8975 1727204062.91001: variable 'ansible_search_path' from source: unknown 8975 1727204062.91014: variable 'ansible_search_path' from source: unknown 8975 1727204062.91061: calling self._execute() 8975 1727204062.91183: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204062.91195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204062.91213: variable 'omit' from source: magic vars 8975 1727204062.91661: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.91772: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204062.91832: variable 'network_state' from source: role '' defaults 8975 1727204062.91850: Evaluated conditional (network_state != {}): False 8975 1727204062.91857: when evaluation is False, skipping this task 8975 1727204062.91864: _execute() done 8975 1727204062.91875: dumping result to json 8975 1727204062.91889: done dumping result, returning 8975 1727204062.91903: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-9356-306d-000000000086] 8975 1727204062.91917: sending task result for task 127b8e07-fff9-9356-306d-000000000086 8975 1727204062.92079: done sending task result for task 127b8e07-fff9-9356-306d-000000000086 8975 1727204062.92083: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204062.92139: no more pending results, returning what we have 8975 1727204062.92143: results queue empty 8975 1727204062.92144: checking for any_errors_fatal 8975 1727204062.92151: done checking for any_errors_fatal 8975 1727204062.92152: checking for max_fail_percentage 8975 1727204062.92154: done checking for max_fail_percentage 8975 1727204062.92155: checking to see if all hosts have failed and the running result is not ok 8975 1727204062.92156: done checking to see if all hosts have failed 8975 1727204062.92157: getting the remaining hosts for this loop 8975 1727204062.92159: done getting the remaining hosts for this loop 8975 1727204062.92164: getting the next task for host managed-node2 8975 1727204062.92176: done getting next task for host managed-node2 8975 1727204062.92180: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8975 1727204062.92185: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 8975 1727204062.92208: getting variables 8975 1727204062.92210: in VariableManager get_vars() 8975 1727204062.92261: Calling all_inventory to load vars for managed-node2 8975 1727204062.92265: Calling groups_inventory to load vars for managed-node2 8975 1727204062.92473: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204062.92485: Calling all_plugins_play to load vars for managed-node2 8975 1727204062.92488: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204062.92491: Calling groups_plugins_play to load vars for managed-node2 8975 1727204062.94297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204062.96608: done with get_vars() 8975 1727204062.96639: done getting variables 8975 1727204062.96701: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.067) 0:00:34.284 ***** 8975 1727204062.96784: entering _queue_task() for managed-node2/package 8975 1727204062.97302: worker is 1 (out of 1 available) 8975 1727204062.97318: exiting _queue_task() for managed-node2/package 8975 1727204062.97334: done queuing things up, now waiting for results queue to drain 8975 1727204062.97336: waiting for pending results... 
8975 1727204062.97888: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 8975 1727204062.97894: in run() - task 127b8e07-fff9-9356-306d-000000000087 8975 1727204062.97899: variable 'ansible_search_path' from source: unknown 8975 1727204062.97902: variable 'ansible_search_path' from source: unknown 8975 1727204062.97904: calling self._execute() 8975 1727204062.97981: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204062.97999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204062.98016: variable 'omit' from source: magic vars 8975 1727204062.98459: variable 'ansible_distribution_major_version' from source: facts 8975 1727204062.98480: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204062.98625: variable 'network_state' from source: role '' defaults 8975 1727204062.98651: Evaluated conditional (network_state != {}): False 8975 1727204062.98659: when evaluation is False, skipping this task 8975 1727204062.98669: _execute() done 8975 1727204062.98678: dumping result to json 8975 1727204062.98686: done dumping result, returning 8975 1727204062.98699: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-9356-306d-000000000087] 8975 1727204062.98713: sending task result for task 127b8e07-fff9-9356-306d-000000000087 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204062.98912: no more pending results, returning what we have 8975 1727204062.98916: results queue empty 8975 1727204062.98917: checking for any_errors_fatal 8975 1727204062.98924: done checking for any_errors_fatal 8975 1727204062.98925: checking for max_fail_percentage 8975 1727204062.98930: done checking for max_fail_percentage 8975 1727204062.98931: checking to see if all hosts have failed and the running result is not ok 8975 1727204062.98933: done checking to see if all hosts have failed 8975 1727204062.98934: getting the remaining hosts for this loop 8975 1727204062.98936: done getting the remaining hosts for this loop 8975 1727204062.98940: getting the next task for host managed-node2 8975 1727204062.98951: done getting next task for host managed-node2 8975 1727204062.98957: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8975 1727204062.98962: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204062.98989: getting variables 8975 1727204062.98991: in VariableManager get_vars() 8975 1727204062.99043: Calling all_inventory to load vars for managed-node2 8975 1727204062.99047: Calling groups_inventory to load vars for managed-node2 8975 1727204062.99049: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204062.99064: Calling all_plugins_play to load vars for managed-node2 8975 1727204062.99173: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204062.99178: Calling groups_plugins_play to load vars for managed-node2 8975 1727204062.99884: done sending task result for task 127b8e07-fff9-9356-306d-000000000087 8975 1727204062.99887: WORKER PROCESS EXITING 8975 1727204063.01129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204063.03346: done with get_vars() 8975 1727204063.03386: done getting variables 8975 1727204063.03457: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.067) 0:00:34.351 ***** 8975 1727204063.03501: entering _queue_task() for managed-node2/service 8975 1727204063.03904: worker is 1 (out of 1 available) 8975 1727204063.03917: exiting _queue_task() for managed-node2/service 8975 1727204063.03932: done queuing things up, now waiting for results queue to drain 8975 1727204063.03934: waiting for pending results... 
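Both nmstate-related install tasks above were skipped for the same reason: network_state is still at its role default of {}. A hedged sketch of how a play could set network_state so those tasks would run (the interface name eth1 and the minimal desired-state body are illustrative, not taken from this run):

    # Sketch: invoking the role with a non-empty network_state, which would
    # make the "network_state != {}" conditionals evaluate to True.
    - hosts: managed-node2
      vars:
        network_state:
          interfaces:
            - name: eth1        # illustrative interface
              type: ethernet
              state: up
      roles:
        - fedora.linux_system_roles.network

In this run the play drives the role through network_connections (the controller_profile/port1_profile/port2_profile play vars seen throughout the log) rather than network_state, so the nmstate code path stays inactive.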
8975 1727204063.04267: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 8975 1727204063.04469: in run() - task 127b8e07-fff9-9356-306d-000000000088 8975 1727204063.04501: variable 'ansible_search_path' from source: unknown 8975 1727204063.04511: variable 'ansible_search_path' from source: unknown 8975 1727204063.04563: calling self._execute() 8975 1727204063.04682: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204063.04696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204063.04715: variable 'omit' from source: magic vars 8975 1727204063.05172: variable 'ansible_distribution_major_version' from source: facts 8975 1727204063.05191: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204063.05335: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204063.05576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204063.08301: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204063.08352: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204063.08395: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204063.08424: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204063.08445: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204063.08514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.08539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.08557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.08590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.08601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.08641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.08659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.08678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8975 1727204063.08708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.08720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.08757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.08774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.08791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.08822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.08837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.08961: variable 'network_connections' from source: task vars 8975 1727204063.08974: variable 'port2_profile' from source: play vars 8975 1727204063.09027: variable 'port2_profile' from source: play vars 8975 1727204063.09037: variable 'port1_profile' from source: play vars 8975 1727204063.09087: variable 'port1_profile' from source: play vars 8975 1727204063.09093: variable 'controller_profile' from source: play vars 8975 1727204063.09139: variable 'controller_profile' from source: play vars 8975 1727204063.09198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204063.09326: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204063.09357: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204063.09393: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204063.09417: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204063.09457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204063.09512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204063.09515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.09771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 8975 1727204063.09774: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204063.09844: variable 'network_connections' from source: task vars 8975 1727204063.09856: variable 'port2_profile' from source: play vars 8975 1727204063.09927: variable 'port2_profile' from source: play vars 8975 1727204063.09941: variable 'port1_profile' from source: play vars 8975 1727204063.10011: variable 'port1_profile' from source: play vars 8975 1727204063.10024: variable 'controller_profile' from source: play vars 8975 1727204063.10095: variable 'controller_profile' from source: play vars 8975 1727204063.10127: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 8975 1727204063.10144: when evaluation is False, skipping this task 8975 1727204063.10152: _execute() done 8975 1727204063.10159: dumping result to json 8975 1727204063.10168: done dumping result, returning 8975 1727204063.10180: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-9356-306d-000000000088] 8975 1727204063.10190: sending task result for task 127b8e07-fff9-9356-306d-000000000088 8975 1727204063.10309: done sending task result for task 127b8e07-fff9-9356-306d-000000000088 8975 1727204063.10316: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 8975 1727204063.10394: no more pending results, returning what we have 8975 1727204063.10397: results queue empty 8975 1727204063.10398: checking for any_errors_fatal 8975 1727204063.10404: done checking for any_errors_fatal 8975 1727204063.10405: checking for max_fail_percentage 8975 1727204063.10407: done checking for max_fail_percentage 8975 1727204063.10408: checking to see if all hosts have failed and the running result is not ok 8975 1727204063.10410: done checking to see if all hosts have failed 8975 1727204063.10411: getting the remaining hosts for this loop 8975 1727204063.10413: done getting the remaining hosts for this loop 8975 1727204063.10418: getting the next task for host managed-node2 8975 1727204063.10433: done getting next task for host managed-node2 8975 1727204063.10437: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8975 1727204063.10441: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204063.10461: getting variables 8975 1727204063.10463: in VariableManager get_vars() 8975 1727204063.10510: Calling all_inventory to load vars for managed-node2 8975 1727204063.10513: Calling groups_inventory to load vars for managed-node2 8975 1727204063.10515: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204063.10526: Calling all_plugins_play to load vars for managed-node2 8975 1727204063.10531: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204063.10534: Calling groups_plugins_play to load vars for managed-node2 8975 1727204063.11941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204063.13110: done with get_vars() 8975 1727204063.13133: done getting variables 8975 1727204063.13189: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.097) 0:00:34.449 ***** 8975 1727204063.13217: entering _queue_task() for managed-node2/service 8975 1727204063.13598: worker is 1 (out of 1 available) 8975 1727204063.13613: exiting _queue_task() for managed-node2/service 8975 1727204063.13627: done queuing things up, now waiting for results queue to drain 8975 1727204063.13628: waiting for pending results... 8975 1727204063.13944: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 8975 1727204063.14159: in run() - task 127b8e07-fff9-9356-306d-000000000089 8975 1727204063.14189: variable 'ansible_search_path' from source: unknown 8975 1727204063.14199: variable 'ansible_search_path' from source: unknown 8975 1727204063.14251: calling self._execute() 8975 1727204063.14378: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204063.14393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204063.14411: variable 'omit' from source: magic vars 8975 1727204063.14928: variable 'ansible_distribution_major_version' from source: facts 8975 1727204063.14938: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204063.15074: variable 'network_provider' from source: set_fact 8975 1727204063.15077: variable 'network_state' from source: role '' defaults 8975 1727204063.15087: Evaluated conditional (network_provider == "nm" or network_state != {}): True 8975 1727204063.15095: variable 'omit' from source: magic vars 8975 1727204063.15162: variable 'omit' from source: magic vars 8975 1727204063.15189: variable 'network_service_name' from source: role '' defaults 8975 1727204063.15243: variable 'network_service_name' from source: role '' defaults 8975 1727204063.15325: variable '__network_provider_setup' from source: role '' defaults 8975 1727204063.15328: variable '__network_service_name_default_nm' from source: role '' defaults 8975 1727204063.15382: variable '__network_service_name_default_nm' from source: role '' defaults 8975 1727204063.15391: variable '__network_packages_default_nm' from source: role '' defaults 8975 1727204063.15439: variable 
'__network_packages_default_nm' from source: role '' defaults 8975 1727204063.15617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204063.17472: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204063.17595: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204063.17619: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204063.17653: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204063.17697: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204063.17773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.17801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.17836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.17911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.17914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.17994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.18019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.18271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.18275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.18278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.18412: variable '__network_packages_default_gobject_packages' from source: role '' defaults 8975 1727204063.18555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.18586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.18628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.18679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.18699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.18962: variable 'ansible_python' from source: facts 8975 1727204063.18967: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 8975 1727204063.19024: variable '__network_wpa_supplicant_required' from source: role '' defaults 8975 1727204063.19110: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8975 1727204063.19212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.19233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.19252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.19293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.19304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.19344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.19364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.19386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.19415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.19426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.19535: variable 'network_connections' from source: task vars 8975 1727204063.19542: variable 'port2_profile' from source: play vars 
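Unlike the preceding tasks, "Enable and start NetworkManager" passes its guard (network_provider == "nm" or network_state != {} evaluated True), so the role goes on to resolve the service name and actually execute. A rough sketch of the pattern this task follows, assuming the conditional and variable names shown in the log (the role's real task is at roles/network/tasks/main.yml:122 and may differ in detail):

    # Sketch: ensure the provider's service is enabled and running; on this
    # systemd host the service action ends up delegating to the systemd module.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}

The remainder of the trace below shows that execution: connection variables are set for the ssh plugin, a temporary directory is created on managed-node2, the cached AnsiballZ_systemd.py payload is transferred over sftp, and the module's JSON result describing the NetworkManager.service unit is streamed back.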
8975 1727204063.19604: variable 'port2_profile' from source: play vars 8975 1727204063.19615: variable 'port1_profile' from source: play vars 8975 1727204063.19673: variable 'port1_profile' from source: play vars 8975 1727204063.19683: variable 'controller_profile' from source: play vars 8975 1727204063.19741: variable 'controller_profile' from source: play vars 8975 1727204063.19822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204063.19983: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204063.20023: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204063.20059: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204063.20093: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204063.20145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204063.20168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204063.20195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.20219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204063.20262: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204063.20474: variable 'network_connections' from source: task vars 8975 1727204063.20478: variable 'port2_profile' from source: play vars 8975 1727204063.20538: variable 'port2_profile' from source: play vars 8975 1727204063.20548: variable 'port1_profile' from source: play vars 8975 1727204063.20606: variable 'port1_profile' from source: play vars 8975 1727204063.20615: variable 'controller_profile' from source: play vars 8975 1727204063.20670: variable 'controller_profile' from source: play vars 8975 1727204063.20699: variable '__network_packages_default_wireless' from source: role '' defaults 8975 1727204063.20760: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204063.20969: variable 'network_connections' from source: task vars 8975 1727204063.20972: variable 'port2_profile' from source: play vars 8975 1727204063.21024: variable 'port2_profile' from source: play vars 8975 1727204063.21035: variable 'port1_profile' from source: play vars 8975 1727204063.21149: variable 'port1_profile' from source: play vars 8975 1727204063.21152: variable 'controller_profile' from source: play vars 8975 1727204063.21247: variable 'controller_profile' from source: play vars 8975 1727204063.21252: variable '__network_packages_default_team' from source: role '' defaults 8975 1727204063.21286: variable '__network_team_connections_defined' from source: role '' defaults 8975 1727204063.21599: variable 'network_connections' from source: task vars 8975 1727204063.21602: variable 'port2_profile' from source: play vars 8975 1727204063.21673: 
variable 'port2_profile' from source: play vars 8975 1727204063.21681: variable 'port1_profile' from source: play vars 8975 1727204063.21811: variable 'port1_profile' from source: play vars 8975 1727204063.21814: variable 'controller_profile' from source: play vars 8975 1727204063.21827: variable 'controller_profile' from source: play vars 8975 1727204063.21891: variable '__network_service_name_default_initscripts' from source: role '' defaults 8975 1727204063.21957: variable '__network_service_name_default_initscripts' from source: role '' defaults 8975 1727204063.21964: variable '__network_packages_default_initscripts' from source: role '' defaults 8975 1727204063.22044: variable '__network_packages_default_initscripts' from source: role '' defaults 8975 1727204063.22287: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 8975 1727204063.22704: variable 'network_connections' from source: task vars 8975 1727204063.22710: variable 'port2_profile' from source: play vars 8975 1727204063.22758: variable 'port2_profile' from source: play vars 8975 1727204063.22767: variable 'port1_profile' from source: play vars 8975 1727204063.22811: variable 'port1_profile' from source: play vars 8975 1727204063.22820: variable 'controller_profile' from source: play vars 8975 1727204063.22873: variable 'controller_profile' from source: play vars 8975 1727204063.22880: variable 'ansible_distribution' from source: facts 8975 1727204063.22883: variable '__network_rh_distros' from source: role '' defaults 8975 1727204063.22890: variable 'ansible_distribution_major_version' from source: facts 8975 1727204063.22904: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 8975 1727204063.23032: variable 'ansible_distribution' from source: facts 8975 1727204063.23036: variable '__network_rh_distros' from source: role '' defaults 8975 1727204063.23043: variable 'ansible_distribution_major_version' from source: facts 8975 1727204063.23051: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 8975 1727204063.23176: variable 'ansible_distribution' from source: facts 8975 1727204063.23179: variable '__network_rh_distros' from source: role '' defaults 8975 1727204063.23185: variable 'ansible_distribution_major_version' from source: facts 8975 1727204063.23212: variable 'network_provider' from source: set_fact 8975 1727204063.23235: variable 'omit' from source: magic vars 8975 1727204063.23262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204063.23286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204063.23300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204063.23315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204063.23323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204063.23352: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204063.23355: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204063.23358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204063.23439: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 
1727204063.23442: Set connection var ansible_connection to ssh 8975 1727204063.23445: Set connection var ansible_shell_executable to /bin/sh 8975 1727204063.23453: Set connection var ansible_timeout to 10 8975 1727204063.23456: Set connection var ansible_shell_type to sh 8975 1727204063.23467: Set connection var ansible_pipelining to False 8975 1727204063.23489: variable 'ansible_shell_executable' from source: unknown 8975 1727204063.23492: variable 'ansible_connection' from source: unknown 8975 1727204063.23495: variable 'ansible_module_compression' from source: unknown 8975 1727204063.23498: variable 'ansible_shell_type' from source: unknown 8975 1727204063.23500: variable 'ansible_shell_executable' from source: unknown 8975 1727204063.23502: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204063.23505: variable 'ansible_pipelining' from source: unknown 8975 1727204063.23507: variable 'ansible_timeout' from source: unknown 8975 1727204063.23513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204063.23595: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204063.23604: variable 'omit' from source: magic vars 8975 1727204063.23611: starting attempt loop 8975 1727204063.23614: running the handler 8975 1727204063.23677: variable 'ansible_facts' from source: unknown 8975 1727204063.24391: _low_level_execute_command(): starting 8975 1727204063.24395: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204063.24944: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204063.24949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204063.24953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204063.25018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204063.25021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204063.25023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204063.25098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204063.26873: stdout chunk (state=3): >>>/root <<< 8975 1727204063.26978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204063.27041: stderr chunk (state=3): >>><<< 8975 1727204063.27045: stdout chunk (state=3): >>><<< 8975 
1727204063.27068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204063.27079: _low_level_execute_command(): starting 8975 1727204063.27086: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175 `" && echo ansible-tmp-1727204063.2706873-11631-66711255209175="` echo /root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175 `" ) && sleep 0' 8975 1727204063.27572: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204063.27595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204063.27599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204063.27660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204063.27664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204063.27670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204063.27744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204063.29738: stdout chunk (state=3): >>>ansible-tmp-1727204063.2706873-11631-66711255209175=/root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175 <<< 8975 1727204063.29849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204063.29909: stderr chunk (state=3): >>><<< 8975 1727204063.29912: stdout chunk 
(state=3): >>><<< 8975 1727204063.29927: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204063.2706873-11631-66711255209175=/root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204063.29964: variable 'ansible_module_compression' from source: unknown 8975 1727204063.30011: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8975 1727204063.30073: variable 'ansible_facts' from source: unknown 8975 1727204063.30213: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/AnsiballZ_systemd.py 8975 1727204063.30348: Sending initial data 8975 1727204063.30351: Sent initial data (154 bytes) 8975 1727204063.31104: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204063.31147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204063.31151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204063.31190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204063.31269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204063.32883: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204063.32971: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204063.33032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmp2_9sgche /root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/AnsiballZ_systemd.py <<< 8975 1727204063.33036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/AnsiballZ_systemd.py" <<< 8975 1727204063.33127: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmp2_9sgche" to remote "/root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/AnsiballZ_systemd.py" <<< 8975 1727204063.34537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204063.34674: stderr chunk (state=3): >>><<< 8975 1727204063.34678: stdout chunk (state=3): >>><<< 8975 1727204063.34680: done transferring module to remote 8975 1727204063.34682: _low_level_execute_command(): starting 8975 1727204063.34685: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/ /root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/AnsiballZ_systemd.py && sleep 0' 8975 1727204063.35158: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204063.35162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204063.35165: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204063.35170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204063.35219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204063.35223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204063.35227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204063.35295: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204063.37242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204063.37246: stdout chunk (state=3): >>><<< 8975 1727204063.37249: stderr chunk (state=3): >>><<< 8975 1727204063.37264: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204063.37361: _low_level_execute_command(): starting 8975 1727204063.37367: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/AnsiballZ_systemd.py && sleep 0' 8975 1727204063.37991: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204063.38018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204063.38032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204063.38133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204063.38166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204063.38184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204063.38206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204063.38318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204063.70288: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", 
"NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4542464", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3516575744", "CPUUsageNSec": "513845000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 8975 1727204063.70436: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8975 1727204063.72339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204063.72344: stdout chunk (state=3): >>><<< 8975 1727204063.72346: stderr chunk (state=3): >>><<< 8975 1727204063.72384: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4542464", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3516575744", "CPUUsageNSec": "513845000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204063.72968: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204063.72972: _low_level_execute_command(): starting 8975 1727204063.72975: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204063.2706873-11631-66711255209175/ > /dev/null 2>&1 && sleep 0' 8975 1727204063.73607: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204063.73617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204063.73632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204063.73649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204063.73661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204063.73670: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204063.73685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204063.73699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204063.73707: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204063.73714: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8975 1727204063.73725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204063.73734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204063.73860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 8975 1727204063.73863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204063.73867: stderr chunk (state=3): >>>debug2: match found <<< 8975 1727204063.73870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204063.73872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204063.73874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204063.73890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204063.74083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204063.76075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204063.76095: stderr chunk (state=3): >>><<< 8975 1727204063.76104: stdout chunk (state=3): >>><<< 8975 1727204063.76126: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204063.76271: handler run complete 8975 1727204063.76275: attempt loop complete, returning result 8975 1727204063.76277: _execute() done 8975 1727204063.76280: dumping result to json 8975 1727204063.76282: done dumping result, returning 8975 1727204063.76284: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-9356-306d-000000000089] 8975 1727204063.76286: sending task result for task 127b8e07-fff9-9356-306d-000000000089 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204063.76923: no more pending results, returning what we have 8975 1727204063.76930: results queue empty 8975 1727204063.76931: checking for any_errors_fatal 8975 1727204063.76937: done checking for any_errors_fatal 8975 1727204063.76938: checking for max_fail_percentage 8975 1727204063.76940: done checking for max_fail_percentage 8975 1727204063.77037: checking to see if all hosts have failed and the running result is not ok 8975 1727204063.77040: done checking to see if all hosts have failed 8975 1727204063.77041: getting the remaining hosts for this loop 8975 1727204063.77043: done getting the remaining hosts for this loop 8975 
1727204063.77048: getting the next task for host managed-node2 8975 1727204063.77062: done getting next task for host managed-node2 8975 1727204063.77177: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8975 1727204063.77182: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204063.77197: getting variables 8975 1727204063.77199: in VariableManager get_vars() 8975 1727204063.77245: Calling all_inventory to load vars for managed-node2 8975 1727204063.77249: Calling groups_inventory to load vars for managed-node2 8975 1727204063.77251: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204063.77263: Calling all_plugins_play to load vars for managed-node2 8975 1727204063.77580: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204063.77589: Calling groups_plugins_play to load vars for managed-node2 8975 1727204063.78194: done sending task result for task 127b8e07-fff9-9356-306d-000000000089 8975 1727204063.78199: WORKER PROCESS EXITING 8975 1727204063.79642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204063.84348: done with get_vars() 8975 1727204063.84513: done getting variables 8975 1727204063.84700: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.715) 0:00:35.164 ***** 8975 1727204063.84744: entering _queue_task() for managed-node2/service 8975 1727204063.85504: worker is 1 (out of 1 available) 8975 1727204063.85520: exiting _queue_task() for managed-node2/service 8975 1727204063.85537: done queuing things up, now waiting for results queue to drain 8975 1727204063.85539: waiting for pending results... 
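[Editor's note, not part of the captured log] The stdout chunk above is the full JSON result emitted by AnsiballZ_systemd.py for the "Enable and start NetworkManager" task: the systemd module returns the unit's property map as strings. A minimal, hypothetical Python sketch of inspecting such a captured payload offline (field names taken from the output above; the `result_json` string stands in for the real, much larger stdout):

    import json

    # Assumed: result_json holds the JSON object printed by AnsiballZ_systemd.py above
    # (truncated here to a few of the fields visible in the log).
    result_json = (
        '{"name": "NetworkManager", "changed": false, '
        '"status": {"ActiveState": "active", "SubState": "running", '
        '"UnitFileState": "enabled", "MainPID": "3396"}}'
    )

    result = json.loads(result_json)
    status = result["status"]

    # The module reports unit properties as strings, so compare against string literals.
    assert status["ActiveState"] == "active"      # unit is currently running
    assert status["UnitFileState"] == "enabled"   # unit starts at boot
    print(f'{result["name"]}: pid={status["MainPID"]}, changed={result["changed"]}')

Note that the play itself only shows "censored" for this task because 'no_log: true' was in effect; the raw JSON is visible here only because of the debug-level stdout chunks.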
8975 1727204063.85945: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 8975 1727204063.86280: in run() - task 127b8e07-fff9-9356-306d-00000000008a 8975 1727204063.86309: variable 'ansible_search_path' from source: unknown 8975 1727204063.86318: variable 'ansible_search_path' from source: unknown 8975 1727204063.86374: calling self._execute() 8975 1727204063.86506: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204063.86552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204063.86556: variable 'omit' from source: magic vars 8975 1727204063.87000: variable 'ansible_distribution_major_version' from source: facts 8975 1727204063.87019: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204063.87163: variable 'network_provider' from source: set_fact 8975 1727204063.87177: Evaluated conditional (network_provider == "nm"): True 8975 1727204063.87292: variable '__network_wpa_supplicant_required' from source: role '' defaults 8975 1727204063.87417: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 8975 1727204063.87609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204063.90182: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204063.90311: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204063.90325: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204063.90371: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204063.90403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204063.90525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.90637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.90643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.90658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.90680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.90743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.90779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 8975 1727204063.90810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.90868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.90890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.90941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204063.90979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204063.91071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.91076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204063.91079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204063.91291: variable 'network_connections' from source: task vars 8975 1727204063.91297: variable 'port2_profile' from source: play vars 8975 1727204063.91361: variable 'port2_profile' from source: play vars 8975 1727204063.91379: variable 'port1_profile' from source: play vars 8975 1727204063.91459: variable 'port1_profile' from source: play vars 8975 1727204063.91474: variable 'controller_profile' from source: play vars 8975 1727204063.91551: variable 'controller_profile' from source: play vars 8975 1727204063.91729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8975 1727204063.91863: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8975 1727204063.91912: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8975 1727204063.91960: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8975 1727204063.91998: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8975 1727204063.92064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8975 1727204063.92163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8975 1727204063.92169: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204063.92171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8975 1727204063.92219: variable '__network_wireless_connections_defined' from source: role '' defaults 8975 1727204063.92516: variable 'network_connections' from source: task vars 8975 1727204063.92571: variable 'port2_profile' from source: play vars 8975 1727204063.92619: variable 'port2_profile' from source: play vars 8975 1727204063.92637: variable 'port1_profile' from source: play vars 8975 1727204063.92710: variable 'port1_profile' from source: play vars 8975 1727204063.92744: variable 'controller_profile' from source: play vars 8975 1727204063.92836: variable 'controller_profile' from source: play vars 8975 1727204063.92869: Evaluated conditional (__network_wpa_supplicant_required): False 8975 1727204063.92931: when evaluation is False, skipping this task 8975 1727204063.92935: _execute() done 8975 1727204063.92937: dumping result to json 8975 1727204063.92945: done dumping result, returning 8975 1727204063.92948: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-9356-306d-00000000008a] 8975 1727204063.92950: sending task result for task 127b8e07-fff9-9356-306d-00000000008a skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 8975 1727204063.93102: no more pending results, returning what we have 8975 1727204063.93106: results queue empty 8975 1727204063.93107: checking for any_errors_fatal 8975 1727204063.93136: done checking for any_errors_fatal 8975 1727204063.93138: checking for max_fail_percentage 8975 1727204063.93140: done checking for max_fail_percentage 8975 1727204063.93142: checking to see if all hosts have failed and the running result is not ok 8975 1727204063.93143: done checking to see if all hosts have failed 8975 1727204063.93144: getting the remaining hosts for this loop 8975 1727204063.93146: done getting the remaining hosts for this loop 8975 1727204063.93151: getting the next task for host managed-node2 8975 1727204063.93164: done getting next task for host managed-node2 8975 1727204063.93169: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 8975 1727204063.93174: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204063.93196: getting variables 8975 1727204063.93198: in VariableManager get_vars() 8975 1727204063.93249: Calling all_inventory to load vars for managed-node2 8975 1727204063.93253: Calling groups_inventory to load vars for managed-node2 8975 1727204063.93255: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204063.93488: Calling all_plugins_play to load vars for managed-node2 8975 1727204063.93494: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204063.93499: Calling groups_plugins_play to load vars for managed-node2 8975 1727204063.94106: done sending task result for task 127b8e07-fff9-9356-306d-00000000008a 8975 1727204063.94111: WORKER PROCESS EXITING 8975 1727204063.95571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204063.99092: done with get_vars() 8975 1727204063.99138: done getting variables 8975 1727204063.99210: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.145) 0:00:35.309 ***** 8975 1727204063.99256: entering _queue_task() for managed-node2/service 8975 1727204063.99657: worker is 1 (out of 1 available) 8975 1727204063.99673: exiting _queue_task() for managed-node2/service 8975 1727204063.99692: done queuing things up, now waiting for results queue to drain 8975 1727204063.99694: waiting for pending results... 
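[Editor's note, not part of the captured log] The wpa_supplicant task above is skipped because __network_wpa_supplicant_required evaluated to False even though network_provider == "nm" was True. A hypothetical restatement of that decision in Python, only to make the trace easier to follow (the real role evaluates Jinja conditionals over __network_ieee802_1x_connections_defined and __network_wireless_connections_defined; this is not the role's actual expression):

    def wpa_supplicant_required(network_provider: str,
                                ieee802_1x_defined: bool,
                                wireless_defined: bool) -> bool:
        # wpa_supplicant is only needed by the NetworkManager provider when
        # 802.1X or wireless connections are part of the requested profiles.
        return network_provider == "nm" and (ieee802_1x_defined or wireless_defined)

    # Values matching this run: provider is "nm", but the bond/port profiles define
    # no 802.1X or wireless settings, so the task reports
    # "Conditional result was False" and is skipped.
    print(wpa_supplicant_required("nm",
                                  ieee802_1x_defined=False,
                                  wireless_defined=False))  # -> False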
8975 1727204064.00021: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 8975 1727204064.00248: in run() - task 127b8e07-fff9-9356-306d-00000000008b 8975 1727204064.00253: variable 'ansible_search_path' from source: unknown 8975 1727204064.00257: variable 'ansible_search_path' from source: unknown 8975 1727204064.00280: calling self._execute() 8975 1727204064.00398: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204064.00461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204064.00466: variable 'omit' from source: magic vars 8975 1727204064.00873: variable 'ansible_distribution_major_version' from source: facts 8975 1727204064.00906: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204064.01064: variable 'network_provider' from source: set_fact 8975 1727204064.01119: Evaluated conditional (network_provider == "initscripts"): False 8975 1727204064.01122: when evaluation is False, skipping this task 8975 1727204064.01124: _execute() done 8975 1727204064.01133: dumping result to json 8975 1727204064.01135: done dumping result, returning 8975 1727204064.01138: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-9356-306d-00000000008b] 8975 1727204064.01140: sending task result for task 127b8e07-fff9-9356-306d-00000000008b skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 8975 1727204064.01485: no more pending results, returning what we have 8975 1727204064.01489: results queue empty 8975 1727204064.01490: checking for any_errors_fatal 8975 1727204064.01500: done checking for any_errors_fatal 8975 1727204064.01501: checking for max_fail_percentage 8975 1727204064.01503: done checking for max_fail_percentage 8975 1727204064.01504: checking to see if all hosts have failed and the running result is not ok 8975 1727204064.01506: done checking to see if all hosts have failed 8975 1727204064.01506: getting the remaining hosts for this loop 8975 1727204064.01508: done getting the remaining hosts for this loop 8975 1727204064.01513: getting the next task for host managed-node2 8975 1727204064.01522: done getting next task for host managed-node2 8975 1727204064.01526: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8975 1727204064.01534: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204064.02082: getting variables 8975 1727204064.02085: in VariableManager get_vars() 8975 1727204064.02136: Calling all_inventory to load vars for managed-node2 8975 1727204064.02139: Calling groups_inventory to load vars for managed-node2 8975 1727204064.02142: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204064.02153: Calling all_plugins_play to load vars for managed-node2 8975 1727204064.02156: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204064.02159: Calling groups_plugins_play to load vars for managed-node2 8975 1727204064.03088: done sending task result for task 127b8e07-fff9-9356-306d-00000000008b 8975 1727204064.03094: WORKER PROCESS EXITING 8975 1727204064.06393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204064.10857: done with get_vars() 8975 1727204064.10902: done getting variables 8975 1727204064.11182: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.119) 0:00:35.429 ***** 8975 1727204064.11225: entering _queue_task() for managed-node2/copy 8975 1727204064.11819: worker is 1 (out of 1 available) 8975 1727204064.11836: exiting _queue_task() for managed-node2/copy 8975 1727204064.11851: done queuing things up, now waiting for results queue to drain 8975 1727204064.11853: waiting for pending results... 
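[Editor's note, not part of the captured log] The two tasks above ("Enable network service" and "Ensure initscripts network file dependency is present") are skipped for the same reason: network_provider is "nm", not "initscripts". For someone replaying this log on the managed node, a hypothetical manual spot-check of the state the role converged to, using plain systemctl (this is an aid for verification, not part of the playbook):

    import subprocess

    def unit_state(unit: str) -> tuple[str, str]:
        # "systemctl is-active"/"is-enabled" exit non-zero for inactive/disabled
        # units, so read their stdout instead of relying on the return code.
        active = subprocess.run(["systemctl", "is-active", unit],
                                capture_output=True, text=True).stdout.strip()
        enabled = subprocess.run(["systemctl", "is-enabled", unit],
                                 capture_output=True, text=True).stdout.strip()
        return active, enabled

    # Expected on this run, per the systemd result earlier: ('active', 'enabled').
    print(unit_state("NetworkManager"))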
8975 1727204064.12219: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 8975 1727204064.12422: in run() - task 127b8e07-fff9-9356-306d-00000000008c 8975 1727204064.12472: variable 'ansible_search_path' from source: unknown 8975 1727204064.12482: variable 'ansible_search_path' from source: unknown 8975 1727204064.12543: calling self._execute() 8975 1727204064.12673: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204064.12686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204064.12704: variable 'omit' from source: magic vars 8975 1727204064.13193: variable 'ansible_distribution_major_version' from source: facts 8975 1727204064.13212: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204064.13355: variable 'network_provider' from source: set_fact 8975 1727204064.13376: Evaluated conditional (network_provider == "initscripts"): False 8975 1727204064.13384: when evaluation is False, skipping this task 8975 1727204064.13392: _execute() done 8975 1727204064.13400: dumping result to json 8975 1727204064.13407: done dumping result, returning 8975 1727204064.13420: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-9356-306d-00000000008c] 8975 1727204064.13437: sending task result for task 127b8e07-fff9-9356-306d-00000000008c skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8975 1727204064.13645: no more pending results, returning what we have 8975 1727204064.13650: results queue empty 8975 1727204064.13651: checking for any_errors_fatal 8975 1727204064.13658: done checking for any_errors_fatal 8975 1727204064.13659: checking for max_fail_percentage 8975 1727204064.13661: done checking for max_fail_percentage 8975 1727204064.13662: checking to see if all hosts have failed and the running result is not ok 8975 1727204064.13664: done checking to see if all hosts have failed 8975 1727204064.13667: getting the remaining hosts for this loop 8975 1727204064.13669: done getting the remaining hosts for this loop 8975 1727204064.13674: getting the next task for host managed-node2 8975 1727204064.13684: done getting next task for host managed-node2 8975 1727204064.13766: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8975 1727204064.13779: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204064.13800: done sending task result for task 127b8e07-fff9-9356-306d-00000000008c 8975 1727204064.13803: WORKER PROCESS EXITING 8975 1727204064.13879: getting variables 8975 1727204064.13882: in VariableManager get_vars() 8975 1727204064.13936: Calling all_inventory to load vars for managed-node2 8975 1727204064.13939: Calling groups_inventory to load vars for managed-node2 8975 1727204064.13942: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204064.13955: Calling all_plugins_play to load vars for managed-node2 8975 1727204064.13960: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204064.13963: Calling groups_plugins_play to load vars for managed-node2 8975 1727204064.16758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204064.21387: done with get_vars() 8975 1727204064.21432: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.102) 0:00:35.532 ***** 8975 1727204064.21522: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 8975 1727204064.22354: worker is 1 (out of 1 available) 8975 1727204064.22573: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 8975 1727204064.22589: done queuing things up, now waiting for results queue to drain 8975 1727204064.22590: waiting for pending results... 8975 1727204064.22872: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 8975 1727204064.23320: in run() - task 127b8e07-fff9-9356-306d-00000000008d 8975 1727204064.23326: variable 'ansible_search_path' from source: unknown 8975 1727204064.23332: variable 'ansible_search_path' from source: unknown 8975 1727204064.23574: calling self._execute() 8975 1727204064.23607: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204064.23695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204064.23715: variable 'omit' from source: magic vars 8975 1727204064.24668: variable 'ansible_distribution_major_version' from source: facts 8975 1727204064.24675: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204064.24678: variable 'omit' from source: magic vars 8975 1727204064.24758: variable 'omit' from source: magic vars 8975 1727204064.25175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8975 1727204064.30223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8975 1727204064.30462: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8975 1727204064.30515: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8975 1727204064.30774: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8975 1727204064.30777: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8975 1727204064.30910: variable 'network_provider' from source: set_fact 8975 1727204064.31153: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8975 1727204064.31260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8975 1727204064.31346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8975 1727204064.31401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8975 1727204064.31489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8975 1727204064.31702: variable 'omit' from source: magic vars 8975 1727204064.31959: variable 'omit' from source: magic vars 8975 1727204064.32261: variable 'network_connections' from source: task vars 8975 1727204064.32317: variable 'port2_profile' from source: play vars 8975 1727204064.32628: variable 'port2_profile' from source: play vars 8975 1727204064.32632: variable 'port1_profile' from source: play vars 8975 1727204064.32635: variable 'port1_profile' from source: play vars 8975 1727204064.32638: variable 'controller_profile' from source: play vars 8975 1727204064.32799: variable 'controller_profile' from source: play vars 8975 1727204064.33153: variable 'omit' from source: magic vars 8975 1727204064.33254: variable '__lsr_ansible_managed' from source: task vars 8975 1727204064.33346: variable '__lsr_ansible_managed' from source: task vars 8975 1727204064.33714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8975 1727204064.34328: Loaded config def from plugin (lookup/template) 8975 1727204064.34383: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 8975 1727204064.34415: File lookup term: get_ansible_managed.j2 8975 1727204064.34418: variable 'ansible_search_path' from source: unknown 8975 1727204064.34424: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 8975 1727204064.34443: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 8975 1727204064.34460: variable 'ansible_search_path' from source: unknown 8975 1727204064.48941: variable 'ansible_managed' from source: unknown 8975 1727204064.49372: variable 'omit' from source: magic vars 8975 1727204064.49377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204064.49380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204064.49382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204064.49385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204064.49387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204064.49389: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204064.49392: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204064.49394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204064.49407: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204064.49416: Set connection var ansible_connection to ssh 8975 1727204064.49426: Set connection var ansible_shell_executable to /bin/sh 8975 1727204064.49436: Set connection var ansible_timeout to 10 8975 1727204064.49443: Set connection var ansible_shell_type to sh 8975 1727204064.49459: Set connection var ansible_pipelining to False 8975 1727204064.49488: variable 'ansible_shell_executable' from source: unknown 8975 1727204064.49497: variable 'ansible_connection' from source: unknown 8975 1727204064.49503: variable 'ansible_module_compression' from source: unknown 8975 1727204064.49509: variable 'ansible_shell_type' from source: unknown 8975 1727204064.49516: variable 'ansible_shell_executable' from source: unknown 8975 1727204064.49522: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204064.49530: variable 'ansible_pipelining' from source: unknown 8975 1727204064.49537: variable 'ansible_timeout' from source: unknown 8975 1727204064.49556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204064.49703: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204064.49723: variable 'omit' from source: magic vars 8975 1727204064.49737: starting attempt loop 8975 1727204064.49745: running the handler 8975 1727204064.49870: _low_level_execute_command(): starting 8975 1727204064.49874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204064.50509: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204064.50521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204064.50583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204064.50634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204064.50652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204064.50885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204064.51075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204064.52870: stdout chunk (state=3): >>>/root <<< 8975 1727204064.52983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204064.53171: stderr chunk (state=3): >>><<< 8975 1727204064.53175: stdout chunk (state=3): >>><<< 8975 1727204064.53180: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204064.53183: _low_level_execute_command(): starting 8975 1727204064.53186: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304 `" && echo ansible-tmp-1727204064.530946-11673-149211487879304="` echo /root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304 `" ) && sleep 0' 8975 1727204064.54241: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204064.54572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204064.54576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204064.54578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204064.54581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204064.54583: stderr chunk (state=3): 
>>>debug2: match not found <<< 8975 1727204064.54585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204064.54587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 8975 1727204064.54589: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204064.54591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 8975 1727204064.54593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204064.54635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204064.54834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204064.55034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204064.57111: stdout chunk (state=3): >>>ansible-tmp-1727204064.530946-11673-149211487879304=/root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304 <<< 8975 1727204064.57287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204064.57317: stderr chunk (state=3): >>><<< 8975 1727204064.57320: stdout chunk (state=3): >>><<< 8975 1727204064.57446: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204064.530946-11673-149211487879304=/root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204064.57451: variable 'ansible_module_compression' from source: unknown 8975 1727204064.57454: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 8975 1727204064.57685: variable 'ansible_facts' from source: unknown 8975 1727204064.57802: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/AnsiballZ_network_connections.py 8975 
1727204064.58174: Sending initial data 8975 1727204064.58178: Sent initial data (166 bytes) 8975 1727204064.58618: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204064.58630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204064.58681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204064.58738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204064.58751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204064.58760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204064.59069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204064.60763: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8975 1727204064.60999: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204064.61070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204064.61148: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpn2r8a4he /root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/AnsiballZ_network_connections.py <<< 8975 1727204064.61152: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/AnsiballZ_network_connections.py" <<< 8975 1727204064.61226: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpn2r8a4he" to remote "/root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/AnsiballZ_network_connections.py" <<< 8975 1727204064.63401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204064.63405: stdout chunk (state=3): >>><<< 8975 1727204064.63410: stderr chunk (state=3): >>><<< 8975 1727204064.63485: done transferring module to remote 8975 1727204064.63498: _low_level_execute_command(): starting 8975 1727204064.63504: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/ /root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/AnsiballZ_network_connections.py && sleep 0' 8975 1727204064.65660: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204064.65667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204064.65671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204064.65673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204064.65675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204064.65678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204064.65874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204064.65892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204064.66000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204064.68023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204064.68032: stderr chunk (state=3): >>><<< 8975 1727204064.68035: stdout chunk (state=3): >>><<< 8975 1727204064.68050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204064.68053: _low_level_execute_command(): starting 8975 1727204064.68152: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/AnsiballZ_network_connections.py && sleep 0' 8975 1727204064.69458: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204064.69465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8975 1727204064.69475: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204064.69478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204064.69589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204064.69593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204064.69619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204064.69757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204065.26496: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail<<< 8975 1727204065.26535: stdout chunk (state=3): >>> 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/ffedb32f-704a-41b8-a516-608af8e03c14: error=unknown <<< 8975 1727204065.29854: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0bd07c0f-2e8d-423f-a6c9-7d5983583758: error=unknown <<< 8975 1727204065.31746: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/717cc466-aa3c-4897-acd8-59beced800de: error=unknown <<< 8975 1727204065.31863: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 8975 1727204065.34044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204065.34048: stdout chunk (state=3): >>><<< 8975 1727204065.34051: stderr chunk (state=3): >>><<< 8975 1727204065.34175: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/ffedb32f-704a-41b8-a516-608af8e03c14: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0bd07c0f-2e8d-423f-a6c9-7d5983583758: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jh_o9b_6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/717cc466-aa3c-4897-acd8-59beced800de: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204065.34184: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204065.34186: _low_level_execute_command(): starting 8975 1727204065.34189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204064.530946-11673-149211487879304/ > /dev/null 2>&1 && sleep 0' 8975 1727204065.34987: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204065.35101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204065.35116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204065.35205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204065.37317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204065.37332: stdout chunk (state=3): >>><<< 8975 1727204065.37348: stderr chunk (state=3): >>><<< 8975 1727204065.37372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204065.37571: handler run complete 8975 1727204065.37575: attempt loop complete, returning result 8975 1727204065.37578: _execute() done 8975 1727204065.37580: dumping result to json 8975 1727204065.37582: done dumping result, returning 8975 1727204065.37584: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-9356-306d-00000000008d] 8975 1727204065.37587: sending task result for task 127b8e07-fff9-9356-306d-00000000008d 8975 1727204065.37677: done sending task result for task 127b8e07-fff9-9356-306d-00000000008d 8975 1727204065.37681: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 8975 1727204065.37847: no more pending results, returning what we have 8975 1727204065.37851: results queue empty 8975 1727204065.37852: checking for any_errors_fatal 8975 1727204065.37859: done checking for any_errors_fatal 8975 1727204065.37860: checking for max_fail_percentage 8975 1727204065.37861: done checking for max_fail_percentage 8975 1727204065.37862: checking to see if all hosts have failed and the running result is not ok 8975 1727204065.37864: done checking to see if all hosts have failed 8975 1727204065.37864: getting the remaining hosts for this loop 8975 1727204065.37868: done getting the remaining hosts for this loop 8975 1727204065.37872: getting the next task for host managed-node2 8975 
1727204065.37880: done getting next task for host managed-node2 8975 1727204065.37884: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 8975 1727204065.37888: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204065.37905: getting variables 8975 1727204065.37907: in VariableManager get_vars() 8975 1727204065.37972: Calling all_inventory to load vars for managed-node2 8975 1727204065.37976: Calling groups_inventory to load vars for managed-node2 8975 1727204065.37978: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204065.37988: Calling all_plugins_play to load vars for managed-node2 8975 1727204065.37991: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204065.37994: Calling groups_plugins_play to load vars for managed-node2 8975 1727204065.39078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204065.40393: done with get_vars() 8975 1727204065.40434: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:25 -0400 (0:00:01.190) 0:00:36.722 ***** 8975 1727204065.40542: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 8975 1727204065.40951: worker is 1 (out of 1 available) 8975 1727204065.40971: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 8975 1727204065.40988: done queuing things up, now waiting for results queue to drain 8975 1727204065.40989: waiting for pending results... 
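The module_args logged for the "Configure networking connection profiles" task above (provider "nm", with bond0.1, bond0.0 and bond0 all set to persistent_state: absent and state: down) amount to a teardown of the bond controller and its two port profiles. A minimal playbook sketch that would produce the same invocation is shown below; the profile names are taken from the logged module_args, while routing them through the controller_profile/port1_profile/port2_profile play vars (mentioned in the variable-resolution lines above) and passing network_provider as a role var are assumptions, since the test play's actual source is not part of this log.

- hosts: managed-node2
  vars:
    # Assumed values: the log only shows the resolved connection names, not the play vars themselves.
    controller_profile: bond0
    port1_profile: bond0.0
    port2_profile: bond0.1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: "{{ port2_profile }}"
            persistent_state: absent
            state: down
          - name: "{{ port1_profile }}"
            persistent_state: absent
            state: down
          - name: "{{ controller_profile }}"
            persistent_state: absent
            state: down

With this shape, the role hands the network_connections list straight to the fedora.linux_system_roles.network_connections module, which is why the logged invocation mirrors it entry for entry.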
8975 1727204065.41395: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 8975 1727204065.41460: in run() - task 127b8e07-fff9-9356-306d-00000000008e 8975 1727204065.41494: variable 'ansible_search_path' from source: unknown 8975 1727204065.41498: variable 'ansible_search_path' from source: unknown 8975 1727204065.41536: calling self._execute() 8975 1727204065.41657: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.41675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.41690: variable 'omit' from source: magic vars 8975 1727204065.42106: variable 'ansible_distribution_major_version' from source: facts 8975 1727204065.42116: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204065.42214: variable 'network_state' from source: role '' defaults 8975 1727204065.42223: Evaluated conditional (network_state != {}): False 8975 1727204065.42227: when evaluation is False, skipping this task 8975 1727204065.42230: _execute() done 8975 1727204065.42236: dumping result to json 8975 1727204065.42240: done dumping result, returning 8975 1727204065.42249: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-9356-306d-00000000008e] 8975 1727204065.42255: sending task result for task 127b8e07-fff9-9356-306d-00000000008e 8975 1727204065.42356: done sending task result for task 127b8e07-fff9-9356-306d-00000000008e 8975 1727204065.42359: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 8975 1727204065.42417: no more pending results, returning what we have 8975 1727204065.42421: results queue empty 8975 1727204065.42422: checking for any_errors_fatal 8975 1727204065.42433: done checking for any_errors_fatal 8975 1727204065.42434: checking for max_fail_percentage 8975 1727204065.42436: done checking for max_fail_percentage 8975 1727204065.42437: checking to see if all hosts have failed and the running result is not ok 8975 1727204065.42438: done checking to see if all hosts have failed 8975 1727204065.42439: getting the remaining hosts for this loop 8975 1727204065.42441: done getting the remaining hosts for this loop 8975 1727204065.42445: getting the next task for host managed-node2 8975 1727204065.42454: done getting next task for host managed-node2 8975 1727204065.42458: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8975 1727204065.42462: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204065.42485: getting variables 8975 1727204065.42487: in VariableManager get_vars() 8975 1727204065.42526: Calling all_inventory to load vars for managed-node2 8975 1727204065.42529: Calling groups_inventory to load vars for managed-node2 8975 1727204065.42531: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204065.42541: Calling all_plugins_play to load vars for managed-node2 8975 1727204065.42544: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204065.42546: Calling groups_plugins_play to load vars for managed-node2 8975 1727204065.43636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204065.45569: done with get_vars() 8975 1727204065.45605: done getting variables 8975 1727204065.45676: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.051) 0:00:36.774 ***** 8975 1727204065.45718: entering _queue_task() for managed-node2/debug 8975 1727204065.46074: worker is 1 (out of 1 available) 8975 1727204065.46090: exiting _queue_task() for managed-node2/debug 8975 1727204065.46105: done queuing things up, now waiting for results queue to drain 8975 1727204065.46106: waiting for pending results... 8975 1727204065.46321: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 8975 1727204065.46443: in run() - task 127b8e07-fff9-9356-306d-00000000008f 8975 1727204065.46455: variable 'ansible_search_path' from source: unknown 8975 1727204065.46459: variable 'ansible_search_path' from source: unknown 8975 1727204065.46493: calling self._execute() 8975 1727204065.46580: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.46586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.46596: variable 'omit' from source: magic vars 8975 1727204065.46908: variable 'ansible_distribution_major_version' from source: facts 8975 1727204065.46919: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204065.46925: variable 'omit' from source: magic vars 8975 1727204065.46984: variable 'omit' from source: magic vars 8975 1727204065.47013: variable 'omit' from source: magic vars 8975 1727204065.47058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204065.47091: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204065.47108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204065.47123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204065.47136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204065.47162: variable 'inventory_hostname' from source: host vars 
for 'managed-node2' 8975 1727204065.47175: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.47178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.47257: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204065.47261: Set connection var ansible_connection to ssh 8975 1727204065.47264: Set connection var ansible_shell_executable to /bin/sh 8975 1727204065.47277: Set connection var ansible_timeout to 10 8975 1727204065.47280: Set connection var ansible_shell_type to sh 8975 1727204065.47288: Set connection var ansible_pipelining to False 8975 1727204065.47308: variable 'ansible_shell_executable' from source: unknown 8975 1727204065.47311: variable 'ansible_connection' from source: unknown 8975 1727204065.47314: variable 'ansible_module_compression' from source: unknown 8975 1727204065.47316: variable 'ansible_shell_type' from source: unknown 8975 1727204065.47318: variable 'ansible_shell_executable' from source: unknown 8975 1727204065.47322: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.47329: variable 'ansible_pipelining' from source: unknown 8975 1727204065.47332: variable 'ansible_timeout' from source: unknown 8975 1727204065.47334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.47450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204065.47460: variable 'omit' from source: magic vars 8975 1727204065.47467: starting attempt loop 8975 1727204065.47471: running the handler 8975 1727204065.47583: variable '__network_connections_result' from source: set_fact 8975 1727204065.47635: handler run complete 8975 1727204065.47647: attempt loop complete, returning result 8975 1727204065.47650: _execute() done 8975 1727204065.47652: dumping result to json 8975 1727204065.47655: done dumping result, returning 8975 1727204065.47664: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-9356-306d-00000000008f] 8975 1727204065.47672: sending task result for task 127b8e07-fff9-9356-306d-00000000008f 8975 1727204065.47777: done sending task result for task 127b8e07-fff9-9356-306d-00000000008f 8975 1727204065.47780: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 8975 1727204065.47853: no more pending results, returning what we have 8975 1727204065.47857: results queue empty 8975 1727204065.47858: checking for any_errors_fatal 8975 1727204065.47864: done checking for any_errors_fatal 8975 1727204065.47865: checking for max_fail_percentage 8975 1727204065.47869: done checking for max_fail_percentage 8975 1727204065.47870: checking to see if all hosts have failed and the running result is not ok 8975 1727204065.47871: done checking to see if all hosts have failed 8975 1727204065.47871: getting the remaining hosts for this loop 8975 1727204065.47874: done getting the remaining hosts for this loop 8975 1727204065.47877: getting the next task for host managed-node2 8975 1727204065.47885: done getting next task for host managed-node2 8975 1727204065.47895: ^ task is: TASK: fedora.linux_system_roles.network : Show debug 
messages for the network_connections 8975 1727204065.47900: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204065.47911: getting variables 8975 1727204065.47913: in VariableManager get_vars() 8975 1727204065.47955: Calling all_inventory to load vars for managed-node2 8975 1727204065.47958: Calling groups_inventory to load vars for managed-node2 8975 1727204065.47960: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204065.47970: Calling all_plugins_play to load vars for managed-node2 8975 1727204065.47973: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204065.47976: Calling groups_plugins_play to load vars for managed-node2 8975 1727204065.49729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204065.50919: done with get_vars() 8975 1727204065.50949: done getting variables 8975 1727204065.51001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.053) 0:00:36.827 ***** 8975 1727204065.51030: entering _queue_task() for managed-node2/debug 8975 1727204065.51332: worker is 1 (out of 1 available) 8975 1727204065.51349: exiting _queue_task() for managed-node2/debug 8975 1727204065.51362: done queuing things up, now waiting for results queue to drain 8975 1727204065.51363: waiting for pending results... 
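The "Show stderr messages for the network_connections" task that just completed, and the "Show debug messages for the network_connections" task being queued here, are both plain debug actions over the registered __network_connections_result fact: the log loads the debug action plugin and prints __network_connections_result.stderr_lines and the full result respectively. A sketch of that pattern, consistent with the logged output but not copied from the role's tasks/main.yml, is:

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result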
8975 1727204065.51570: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 8975 1727204065.51683: in run() - task 127b8e07-fff9-9356-306d-000000000090 8975 1727204065.51698: variable 'ansible_search_path' from source: unknown 8975 1727204065.51701: variable 'ansible_search_path' from source: unknown 8975 1727204065.51740: calling self._execute() 8975 1727204065.51826: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.51835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.51845: variable 'omit' from source: magic vars 8975 1727204065.52166: variable 'ansible_distribution_major_version' from source: facts 8975 1727204065.52176: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204065.52184: variable 'omit' from source: magic vars 8975 1727204065.52239: variable 'omit' from source: magic vars 8975 1727204065.52276: variable 'omit' from source: magic vars 8975 1727204065.52314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204065.52348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204065.52369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204065.52385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204065.52395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204065.52422: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204065.52425: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.52428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.52510: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204065.52514: Set connection var ansible_connection to ssh 8975 1727204065.52516: Set connection var ansible_shell_executable to /bin/sh 8975 1727204065.52523: Set connection var ansible_timeout to 10 8975 1727204065.52526: Set connection var ansible_shell_type to sh 8975 1727204065.52538: Set connection var ansible_pipelining to False 8975 1727204065.52557: variable 'ansible_shell_executable' from source: unknown 8975 1727204065.52561: variable 'ansible_connection' from source: unknown 8975 1727204065.52564: variable 'ansible_module_compression' from source: unknown 8975 1727204065.52568: variable 'ansible_shell_type' from source: unknown 8975 1727204065.52570: variable 'ansible_shell_executable' from source: unknown 8975 1727204065.52573: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.52579: variable 'ansible_pipelining' from source: unknown 8975 1727204065.52582: variable 'ansible_timeout' from source: unknown 8975 1727204065.52584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.52705: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204065.52714: variable 'omit' from source: 
magic vars 8975 1727204065.52719: starting attempt loop 8975 1727204065.52722: running the handler 8975 1727204065.52763: variable '__network_connections_result' from source: set_fact 8975 1727204065.52834: variable '__network_connections_result' from source: set_fact 8975 1727204065.52936: handler run complete 8975 1727204065.52955: attempt loop complete, returning result 8975 1727204065.52958: _execute() done 8975 1727204065.52961: dumping result to json 8975 1727204065.52967: done dumping result, returning 8975 1727204065.52975: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-9356-306d-000000000090] 8975 1727204065.52981: sending task result for task 127b8e07-fff9-9356-306d-000000000090 8975 1727204065.53084: done sending task result for task 127b8e07-fff9-9356-306d-000000000090 8975 1727204065.53087: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 8975 1727204065.53180: no more pending results, returning what we have 8975 1727204065.53184: results queue empty 8975 1727204065.53185: checking for any_errors_fatal 8975 1727204065.53191: done checking for any_errors_fatal 8975 1727204065.53192: checking for max_fail_percentage 8975 1727204065.53194: done checking for max_fail_percentage 8975 1727204065.53195: checking to see if all hosts have failed and the running result is not ok 8975 1727204065.53196: done checking to see if all hosts have failed 8975 1727204065.53197: getting the remaining hosts for this loop 8975 1727204065.53199: done getting the remaining hosts for this loop 8975 1727204065.53202: getting the next task for host managed-node2 8975 1727204065.53211: done getting next task for host managed-node2 8975 1727204065.53214: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8975 1727204065.53218: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204065.53229: getting variables 8975 1727204065.53231: in VariableManager get_vars() 8975 1727204065.53275: Calling all_inventory to load vars for managed-node2 8975 1727204065.53278: Calling groups_inventory to load vars for managed-node2 8975 1727204065.53280: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204065.53289: Calling all_plugins_play to load vars for managed-node2 8975 1727204065.53298: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204065.53301: Calling groups_plugins_play to load vars for managed-node2 8975 1727204065.54277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204065.55462: done with get_vars() 8975 1727204065.55491: done getting variables 8975 1727204065.55543: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.045) 0:00:36.872 ***** 8975 1727204065.55576: entering _queue_task() for managed-node2/debug 8975 1727204065.55869: worker is 1 (out of 1 available) 8975 1727204065.55885: exiting _queue_task() for managed-node2/debug 8975 1727204065.55897: done queuing things up, now waiting for results queue to drain 8975 1727204065.55899: waiting for pending results... 8975 1727204065.56102: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 8975 1727204065.56227: in run() - task 127b8e07-fff9-9356-306d-000000000091 8975 1727204065.56245: variable 'ansible_search_path' from source: unknown 8975 1727204065.56254: variable 'ansible_search_path' from source: unknown 8975 1727204065.56292: calling self._execute() 8975 1727204065.56378: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.56383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.56394: variable 'omit' from source: magic vars 8975 1727204065.56690: variable 'ansible_distribution_major_version' from source: facts 8975 1727204065.56700: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204065.56794: variable 'network_state' from source: role '' defaults 8975 1727204065.56803: Evaluated conditional (network_state != {}): False 8975 1727204065.56806: when evaluation is False, skipping this task 8975 1727204065.56810: _execute() done 8975 1727204065.56813: dumping result to json 8975 1727204065.56816: done dumping result, returning 8975 1727204065.56824: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-9356-306d-000000000091] 8975 1727204065.56833: sending task result for task 127b8e07-fff9-9356-306d-000000000091 8975 1727204065.56932: done sending task result for task 127b8e07-fff9-9356-306d-000000000091 8975 1727204065.56935: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 8975 1727204065.56986: no more pending results, returning what we have 8975 1727204065.56990: results 
queue empty 8975 1727204065.56991: checking for any_errors_fatal 8975 1727204065.57003: done checking for any_errors_fatal 8975 1727204065.57004: checking for max_fail_percentage 8975 1727204065.57006: done checking for max_fail_percentage 8975 1727204065.57007: checking to see if all hosts have failed and the running result is not ok 8975 1727204065.57008: done checking to see if all hosts have failed 8975 1727204065.57009: getting the remaining hosts for this loop 8975 1727204065.57010: done getting the remaining hosts for this loop 8975 1727204065.57014: getting the next task for host managed-node2 8975 1727204065.57023: done getting next task for host managed-node2 8975 1727204065.57030: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 8975 1727204065.57034: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204065.57056: getting variables 8975 1727204065.57057: in VariableManager get_vars() 8975 1727204065.57105: Calling all_inventory to load vars for managed-node2 8975 1727204065.57108: Calling groups_inventory to load vars for managed-node2 8975 1727204065.57110: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204065.57119: Calling all_plugins_play to load vars for managed-node2 8975 1727204065.57122: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204065.57125: Calling groups_plugins_play to load vars for managed-node2 8975 1727204065.58209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204065.59397: done with get_vars() 8975 1727204065.59423: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.039) 0:00:36.912 ***** 8975 1727204065.59514: entering _queue_task() for managed-node2/ping 8975 1727204065.59809: worker is 1 (out of 1 available) 8975 1727204065.59825: exiting _queue_task() for managed-node2/ping 8975 1727204065.59841: done queuing things up, now waiting for results queue to drain 8975 1727204065.59842: waiting for pending results... 
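
Before the worker result comes back, it is worth noting what is actually being queued here: the trace resolves this task to the builtin ping module, run through the 'normal' action plugin over the cached ssh connection. A minimal sketch of the role task at roles/network/tasks/main.yml:192, reconstructed from the trace rather than quoted from the role source, so treat the exact wording as an assumption:

    # Hedged reconstruction of the "Re-test connectivity" task seen in this trace.
    - name: Re-test connectivity
      ping:

A bare ping task like this is consistent with everything that follows: the 'normal' action plugin load, the cached ansible.modules.ping AnsiballZ payload, and the {"ping": "pong"} result returned by the remote interpreter.
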
8975 1727204065.60037: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 8975 1727204065.60154: in run() - task 127b8e07-fff9-9356-306d-000000000092 8975 1727204065.60168: variable 'ansible_search_path' from source: unknown 8975 1727204065.60171: variable 'ansible_search_path' from source: unknown 8975 1727204065.60207: calling self._execute() 8975 1727204065.60289: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.60293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.60307: variable 'omit' from source: magic vars 8975 1727204065.60613: variable 'ansible_distribution_major_version' from source: facts 8975 1727204065.60626: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204065.60632: variable 'omit' from source: magic vars 8975 1727204065.60684: variable 'omit' from source: magic vars 8975 1727204065.60713: variable 'omit' from source: magic vars 8975 1727204065.60755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204065.60788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204065.60804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204065.60821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204065.60832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204065.60861: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204065.60864: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.60869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.60943: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204065.60946: Set connection var ansible_connection to ssh 8975 1727204065.60949: Set connection var ansible_shell_executable to /bin/sh 8975 1727204065.60956: Set connection var ansible_timeout to 10 8975 1727204065.60960: Set connection var ansible_shell_type to sh 8975 1727204065.60974: Set connection var ansible_pipelining to False 8975 1727204065.60992: variable 'ansible_shell_executable' from source: unknown 8975 1727204065.60995: variable 'ansible_connection' from source: unknown 8975 1727204065.60998: variable 'ansible_module_compression' from source: unknown 8975 1727204065.61000: variable 'ansible_shell_type' from source: unknown 8975 1727204065.61003: variable 'ansible_shell_executable' from source: unknown 8975 1727204065.61005: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204065.61010: variable 'ansible_pipelining' from source: unknown 8975 1727204065.61013: variable 'ansible_timeout' from source: unknown 8975 1727204065.61017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204065.61186: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8975 1727204065.61192: variable 'omit' from source: magic vars 8975 1727204065.61195: starting attempt loop 8975 
1727204065.61198: running the handler 8975 1727204065.61214: _low_level_execute_command(): starting 8975 1727204065.61220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204065.61798: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204065.61803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204065.61807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204065.61870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204065.61876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204065.61879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204065.61956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204065.63753: stdout chunk (state=3): >>>/root <<< 8975 1727204065.63855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204065.63921: stderr chunk (state=3): >>><<< 8975 1727204065.63925: stdout chunk (state=3): >>><<< 8975 1727204065.63946: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204065.63958: _low_level_execute_command(): starting 8975 1727204065.63964: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940 `" && echo ansible-tmp-1727204065.639463-11724-44993151052940="` 
echo /root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940 `" ) && sleep 0' 8975 1727204065.64473: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204065.64477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 8975 1727204065.64480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8975 1727204065.64492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204065.64513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204065.64551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204065.64554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204065.64556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204065.64634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204065.66667: stdout chunk (state=3): >>>ansible-tmp-1727204065.639463-11724-44993151052940=/root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940 <<< 8975 1727204065.66786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204065.66851: stderr chunk (state=3): >>><<< 8975 1727204065.66854: stdout chunk (state=3): >>><<< 8975 1727204065.66874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204065.639463-11724-44993151052940=/root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204065.66919: variable 'ansible_module_compression' from source: unknown 8975 1727204065.66955: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 8975 1727204065.66989: variable 'ansible_facts' from source: unknown 8975 1727204065.67046: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/AnsiballZ_ping.py 8975 1727204065.67157: Sending initial data 8975 1727204065.67161: Sent initial data (150 bytes) 8975 1727204065.67685: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204065.67689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204065.67691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204065.67694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204065.67697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204065.67750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204065.67757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204065.67830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204065.69499: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204065.69581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204065.69665: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmptj25yc0m /root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/AnsiballZ_ping.py <<< 8975 1727204065.69671: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/AnsiballZ_ping.py" <<< 8975 1727204065.69741: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmptj25yc0m" to remote "/root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/AnsiballZ_ping.py" <<< 8975 1727204065.70554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204065.70718: stderr chunk (state=3): >>><<< 8975 1727204065.70722: stdout chunk (state=3): >>><<< 8975 1727204065.70724: done transferring module to remote 8975 1727204065.70727: _low_level_execute_command(): starting 8975 1727204065.70729: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/ /root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/AnsiballZ_ping.py && sleep 0' 8975 1727204065.71338: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204065.71356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204065.71374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204065.71394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204065.71425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 8975 1727204065.71438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 8975 1727204065.71486: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204065.71550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204065.71603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204065.71606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204065.71729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204065.73691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204065.73722: stderr chunk (state=3): >>><<< 8975 1727204065.73732: stdout chunk (state=3): >>><<< 8975 1727204065.73867: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204065.73872: _low_level_execute_command(): starting 8975 1727204065.73875: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/AnsiballZ_ping.py && sleep 0' 8975 1727204065.74501: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204065.74589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204065.74632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204065.74657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204065.74671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204065.75106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204065.91865: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 8975 1727204065.93290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204065.93355: stderr chunk (state=3): >>><<< 8975 1727204065.93408: stdout chunk (state=3): >>><<< 8975 1727204065.93411: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204065.93426: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204065.93441: _low_level_execute_command(): starting 8975 1727204065.93450: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204065.639463-11724-44993151052940/ > /dev/null 2>&1 && sleep 0' 8975 1727204065.94508: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204065.94512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204065.94529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204065.94558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204065.94578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204065.94821: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204065.94943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204065.95030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204065.95099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204065.97474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204065.97478: stdout chunk (state=3): >>><<< 8975 1727204065.97486: stderr chunk (state=3): >>><<< 8975 1727204065.97489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204065.97492: handler run complete 8975 1727204065.97494: attempt loop complete, returning result 8975 1727204065.97497: _execute() done 8975 1727204065.97499: dumping result to json 8975 1727204065.97501: done dumping result, returning 8975 1727204065.97505: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-9356-306d-000000000092] 8975 1727204065.97508: sending task result for task 127b8e07-fff9-9356-306d-000000000092 8975 1727204065.97590: done sending task result for task 127b8e07-fff9-9356-306d-000000000092 8975 1727204065.97594: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 8975 1727204065.97669: no more pending results, returning what we have 8975 1727204065.97674: results queue empty 8975 1727204065.97675: checking for any_errors_fatal 8975 1727204065.97682: done checking for any_errors_fatal 8975 1727204065.97683: checking for max_fail_percentage 8975 1727204065.97685: done checking for max_fail_percentage 8975 1727204065.97686: checking to see if all hosts have failed and the running result is not ok 8975 1727204065.97687: done checking to see if all hosts have failed 8975 1727204065.97688: getting the remaining hosts for this loop 8975 1727204065.97691: done getting the remaining hosts for this loop 8975 1727204065.97695: getting the next task for host managed-node2 8975 1727204065.97709: done getting next task for host managed-node2 8975 1727204065.97711: ^ task is: TASK: meta (role_complete) 8975 1727204065.97716: ^ state is: HOST STATE: block=2, 
task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204065.97731: getting variables 8975 1727204065.97733: in VariableManager get_vars() 8975 1727204065.98011: Calling all_inventory to load vars for managed-node2 8975 1727204065.98015: Calling groups_inventory to load vars for managed-node2 8975 1727204065.98018: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204065.98032: Calling all_plugins_play to load vars for managed-node2 8975 1727204065.98036: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204065.98040: Calling groups_plugins_play to load vars for managed-node2 8975 1727204066.00476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204066.02761: done with get_vars() 8975 1727204066.02799: done getting variables 8975 1727204066.02900: done queuing things up, now waiting for results queue to drain 8975 1727204066.02902: results queue empty 8975 1727204066.02903: checking for any_errors_fatal 8975 1727204066.02906: done checking for any_errors_fatal 8975 1727204066.02907: checking for max_fail_percentage 8975 1727204066.02909: done checking for max_fail_percentage 8975 1727204066.02909: checking to see if all hosts have failed and the running result is not ok 8975 1727204066.02910: done checking to see if all hosts have failed 8975 1727204066.02911: getting the remaining hosts for this loop 8975 1727204066.02912: done getting the remaining hosts for this loop 8975 1727204066.02915: getting the next task for host managed-node2 8975 1727204066.02920: done getting next task for host managed-node2 8975 1727204066.02923: ^ task is: TASK: Delete the device '{{ controller_device }}' 8975 1727204066.02925: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204066.02927: getting variables 8975 1727204066.02928: in VariableManager get_vars() 8975 1727204066.02949: Calling all_inventory to load vars for managed-node2 8975 1727204066.02951: Calling groups_inventory to load vars for managed-node2 8975 1727204066.02954: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204066.02959: Calling all_plugins_play to load vars for managed-node2 8975 1727204066.02962: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204066.02965: Calling groups_plugins_play to load vars for managed-node2 8975 1727204066.04473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204066.06861: done with get_vars() 8975 1727204066.06899: done getting variables 8975 1727204066.06963: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 8975 1727204066.07110: variable 'controller_device' from source: play vars TASK [Delete the device 'deprecated-bond'] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.476) 0:00:37.388 ***** 8975 1727204066.07147: entering _queue_task() for managed-node2/command 8975 1727204066.07548: worker is 1 (out of 1 available) 8975 1727204066.07571: exiting _queue_task() for managed-node2/command 8975 1727204066.07585: done queuing things up, now waiting for results queue to drain 8975 1727204066.07587: waiting for pending results... 
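
The task queued next comes from the test playbook itself rather than the role; the trace shows the command action plugin loading and the play-level controller_device variable rendering into the task name. A rough sketch of what tests_bond_deprecated.yml:125 likely contains follows; only the task name and the 'ip link del' command line are confirmed by the log, the failed_when guard is an assumption based on this being cleanup:

    # Hedged reconstruction of the test-playbook cleanup task.
    - name: Delete the device '{{ controller_device }}'
      command: ip link del {{ controller_device }}
      failed_when: false

Because the role has just put the bond profiles into the down/absent states shown earlier, the underlying kernel device may already be gone by the time this cleanup runs, which matches the 'Cannot find device "deprecated-bond"' rc=1 result further down.
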
8975 1727204066.07998: running TaskExecutor() for managed-node2/TASK: Delete the device 'deprecated-bond' 8975 1727204066.08003: in run() - task 127b8e07-fff9-9356-306d-0000000000c2 8975 1727204066.08006: variable 'ansible_search_path' from source: unknown 8975 1727204066.08009: calling self._execute() 8975 1727204066.08115: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204066.08129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204066.08145: variable 'omit' from source: magic vars 8975 1727204066.08551: variable 'ansible_distribution_major_version' from source: facts 8975 1727204066.08573: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204066.08585: variable 'omit' from source: magic vars 8975 1727204066.08612: variable 'omit' from source: magic vars 8975 1727204066.08724: variable 'controller_device' from source: play vars 8975 1727204066.08850: variable 'omit' from source: magic vars 8975 1727204066.08855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204066.08858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204066.08879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204066.08903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204066.08922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204066.08964: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204066.08975: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204066.08983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204066.09103: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204066.09112: Set connection var ansible_connection to ssh 8975 1727204066.09124: Set connection var ansible_shell_executable to /bin/sh 8975 1727204066.09134: Set connection var ansible_timeout to 10 8975 1727204066.09142: Set connection var ansible_shell_type to sh 8975 1727204066.09160: Set connection var ansible_pipelining to False 8975 1727204066.09195: variable 'ansible_shell_executable' from source: unknown 8975 1727204066.09203: variable 'ansible_connection' from source: unknown 8975 1727204066.09212: variable 'ansible_module_compression' from source: unknown 8975 1727204066.09219: variable 'ansible_shell_type' from source: unknown 8975 1727204066.09226: variable 'ansible_shell_executable' from source: unknown 8975 1727204066.09233: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204066.09270: variable 'ansible_pipelining' from source: unknown 8975 1727204066.09273: variable 'ansible_timeout' from source: unknown 8975 1727204066.09276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204066.09431: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204066.09453: variable 'omit' from source: magic vars 8975 1727204066.09502: starting 
attempt loop 8975 1727204066.09506: running the handler 8975 1727204066.09508: _low_level_execute_command(): starting 8975 1727204066.09511: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204066.10399: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.10448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204066.10480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.10508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.10624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.12402: stdout chunk (state=3): >>>/root <<< 8975 1727204066.12625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.12629: stdout chunk (state=3): >>><<< 8975 1727204066.12632: stderr chunk (state=3): >>><<< 8975 1727204066.12671: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204066.12771: _low_level_execute_command(): starting 8975 1727204066.12776: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264 `" && echo ansible-tmp-1727204066.1265812-11752-272207761471264="` echo /root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264 `" ) && 
sleep 0' 8975 1727204066.13416: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204066.13429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204066.13484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204066.13488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.13565: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.13615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204066.13642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.13675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.13783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.15868: stdout chunk (state=3): >>>ansible-tmp-1727204066.1265812-11752-272207761471264=/root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264 <<< 8975 1727204066.16381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.16385: stdout chunk (state=3): >>><<< 8975 1727204066.16388: stderr chunk (state=3): >>><<< 8975 1727204066.16390: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204066.1265812-11752-272207761471264=/root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204066.16607: variable 'ansible_module_compression' from source: unknown 8975 1727204066.16610: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204066.16613: variable 'ansible_facts' from source: unknown 8975 1727204066.16673: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/AnsiballZ_command.py 8975 1727204066.16956: Sending initial data 8975 1727204066.16970: Sent initial data (155 bytes) 8975 1727204066.17588: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204066.17634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204066.17738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.17761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204066.17797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.17818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.17924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.19605: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 8975 1727204066.19633: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204066.19698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204066.19771: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpfg_ftzwz /root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/AnsiballZ_command.py <<< 8975 1727204066.19775: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/AnsiballZ_command.py" <<< 8975 1727204066.19840: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpfg_ftzwz" to remote "/root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/AnsiballZ_command.py" <<< 8975 1727204066.20741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.21075: stderr chunk (state=3): >>><<< 8975 1727204066.21079: stdout chunk (state=3): >>><<< 8975 1727204066.21082: done transferring module to remote 8975 1727204066.21084: _low_level_execute_command(): starting 8975 1727204066.21087: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/ /root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/AnsiballZ_command.py && sleep 0' 8975 1727204066.21980: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204066.21985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.21989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204066.21991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.22220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.22247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.22315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.24299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.24333: stdout chunk (state=3): >>><<< 8975 1727204066.24336: stderr chunk (state=3): >>><<< 8975 1727204066.24452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204066.24456: _low_level_execute_command(): starting 8975 1727204066.24458: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/AnsiballZ_command.py && sleep 0' 8975 1727204066.25306: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.25364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.25444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.42945: stdout chunk (state=3): >>> <<< 8975 1727204066.42968: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-24 14:54:26.419959", "end": "2024-09-24 14:54:26.427755", "delta": "0:00:00.007796", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204066.44506: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204066.44510: stdout chunk (state=3): >>><<< 8975 1727204066.44513: stderr chunk (state=3): >>><<< 8975 1727204066.44875: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-24 14:54:26.419959", "end": "2024-09-24 14:54:26.427755", "delta": "0:00:00.007796", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
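
The non-zero rc above simply means the interface had already been removed, so 'ip link del' had nothing left to delete. In a throwaway test cleanup that is harmless; a playbook that needed this step to be idempotent could probe for the device first. A hedged sketch, with the 'device_check' register name invented purely for illustration:

    # Hypothetical idempotent variant, not part of the original test.
    - name: Check whether the device still exists
      command: ip link show {{ controller_device }}
      register: device_check
      changed_when: false
      failed_when: false

    - name: Delete the device only if it is present
      command: ip link del {{ controller_device }}
      when: device_check.rc == 0
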
8975 1727204066.44881: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204066.44884: _low_level_execute_command(): starting 8975 1727204066.44886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204066.1265812-11752-272207761471264/ > /dev/null 2>&1 && sleep 0' 8975 1727204066.46171: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204066.46189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.46260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.46372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.46495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.48509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.48699: stderr chunk (state=3): >>><<< 8975 1727204066.48704: stdout chunk (state=3): >>><<< 8975 1727204066.48730: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204066.48931: handler run complete 8975 1727204066.48935: Evaluated conditional (False): False 8975 1727204066.48937: Evaluated conditional (False): False 8975 1727204066.48939: attempt loop complete, returning result 8975 1727204066.48941: _execute() done 8975 1727204066.48944: dumping result to json 8975 1727204066.48946: done dumping result, returning 8975 1727204066.48948: done running TaskExecutor() for managed-node2/TASK: Delete the device 'deprecated-bond' [127b8e07-fff9-9356-306d-0000000000c2] 8975 1727204066.48950: sending task result for task 127b8e07-fff9-9356-306d-0000000000c2 ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "deprecated-bond" ], "delta": "0:00:00.007796", "end": "2024-09-24 14:54:26.427755", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:54:26.419959" } STDERR: Cannot find device "deprecated-bond" MSG: non-zero return code 8975 1727204066.49435: no more pending results, returning what we have 8975 1727204066.49439: results queue empty 8975 1727204066.49440: checking for any_errors_fatal 8975 1727204066.49442: done checking for any_errors_fatal 8975 1727204066.49443: checking for max_fail_percentage 8975 1727204066.49446: done checking for max_fail_percentage 8975 1727204066.49447: checking to see if all hosts have failed and the running result is not ok 8975 1727204066.49448: done checking to see if all hosts have failed 8975 1727204066.49449: getting the remaining hosts for this loop 8975 1727204066.49451: done getting the remaining hosts for this loop 8975 1727204066.49456: getting the next task for host managed-node2 8975 1727204066.49470: done getting next task for host managed-node2 8975 1727204066.49473: ^ task is: TASK: Remove test interfaces 8975 1727204066.49478: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204066.49485: getting variables 8975 1727204066.49487: in VariableManager get_vars() 8975 1727204066.49535: Calling all_inventory to load vars for managed-node2 8975 1727204066.49539: Calling groups_inventory to load vars for managed-node2 8975 1727204066.49541: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204066.49886: Calling all_plugins_play to load vars for managed-node2 8975 1727204066.49891: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204066.49895: Calling groups_plugins_play to load vars for managed-node2 8975 1727204066.50523: done sending task result for task 127b8e07-fff9-9356-306d-0000000000c2 8975 1727204066.50528: WORKER PROCESS EXITING 8975 1727204066.63185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204066.65304: done with get_vars() 8975 1727204066.65342: done getting variables 8975 1727204066.65402: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.582) 0:00:37.971 ***** 8975 1727204066.65434: entering _queue_task() for managed-node2/shell 8975 1727204066.65823: worker is 1 (out of 1 available) 8975 1727204066.65839: exiting _queue_task() for managed-node2/shell 8975 1727204066.65852: done queuing things up, now waiting for results queue to drain 8975 1727204066.65855: waiting for pending results... 
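[editor's note] Every SSH exchange in this log reports `auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'` and `mux_client_request_session`, i.e. each command is multiplexed over one persistent OpenSSH ControlMaster connection instead of renegotiating SSH per task step. A hedged sketch of issuing a command over such a shared master from Python (the option names are standard OpenSSH ones; host, user and control path are taken from the log, the ControlPersist value is illustrative):

```python
#!/usr/bin/python3
"""Hedged sketch: run a remote command over an OpenSSH ControlMaster socket.

The log shows every task reusing the master at /root/.ansible/cp/7ef5e35320;
this sketch shows the kind of ssh invocation that produces that behaviour.
"""
import subprocess

HOST = "10.31.47.73"
CONTROL_PATH = "/root/.ansible/cp/7ef5e35320"

SSH_OPTS = [
    "-o", "ControlMaster=auto",           # create a master only if one is not already running
    "-o", f"ControlPath={CONTROL_PATH}",  # shared socket reused by later invocations
    "-o", "ControlPersist=60s",           # keep the master alive between commands (illustrative value)
]


def run_remote(command: str) -> subprocess.CompletedProcess:
    """Execute a single shell command on the managed node over the shared master."""
    return subprocess.run(
        ["ssh", *SSH_OPTS, f"root@{HOST}", command],
        capture_output=True,
        text=True,
    )


if __name__ == "__main__":
    result = run_remote("echo ~ && sleep 0")  # the same home-directory probe the log shows
    print(result.returncode, result.stdout.strip())
```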
8975 1727204066.66170: running TaskExecutor() for managed-node2/TASK: Remove test interfaces 8975 1727204066.66525: in run() - task 127b8e07-fff9-9356-306d-0000000000c6 8975 1727204066.66541: variable 'ansible_search_path' from source: unknown 8975 1727204066.66546: variable 'ansible_search_path' from source: unknown 8975 1727204066.66699: calling self._execute() 8975 1727204066.67024: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204066.67032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204066.67045: variable 'omit' from source: magic vars 8975 1727204066.67661: variable 'ansible_distribution_major_version' from source: facts 8975 1727204066.67668: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204066.67683: variable 'omit' from source: magic vars 8975 1727204066.67746: variable 'omit' from source: magic vars 8975 1727204066.67920: variable 'dhcp_interface1' from source: play vars 8975 1727204066.67924: variable 'dhcp_interface2' from source: play vars 8975 1727204066.67949: variable 'omit' from source: magic vars 8975 1727204066.67996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204066.68046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204066.68061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204066.68081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204066.68093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204066.68265: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204066.68271: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204066.68274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204066.68277: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204066.68281: Set connection var ansible_connection to ssh 8975 1727204066.68283: Set connection var ansible_shell_executable to /bin/sh 8975 1727204066.68286: Set connection var ansible_timeout to 10 8975 1727204066.68289: Set connection var ansible_shell_type to sh 8975 1727204066.68352: Set connection var ansible_pipelining to False 8975 1727204066.68356: variable 'ansible_shell_executable' from source: unknown 8975 1727204066.68359: variable 'ansible_connection' from source: unknown 8975 1727204066.68361: variable 'ansible_module_compression' from source: unknown 8975 1727204066.68364: variable 'ansible_shell_type' from source: unknown 8975 1727204066.68372: variable 'ansible_shell_executable' from source: unknown 8975 1727204066.68375: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204066.68377: variable 'ansible_pipelining' from source: unknown 8975 1727204066.68380: variable 'ansible_timeout' from source: unknown 8975 1727204066.68383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204066.68517: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 8975 1727204066.68526: variable 'omit' from source: magic vars 8975 1727204066.68532: starting attempt loop 8975 1727204066.68536: running the handler 8975 1727204066.68544: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204066.68575: _low_level_execute_command(): starting 8975 1727204066.68591: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204066.69617: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.69625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.69741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.71419: stdout chunk (state=3): >>>/root <<< 8975 1727204066.71640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.71644: stdout chunk (state=3): >>><<< 8975 1727204066.71648: stderr chunk (state=3): >>><<< 8975 1727204066.71651: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204066.71654: _low_level_execute_command(): starting 8975 
1727204066.71658: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772 `" && echo ansible-tmp-1727204066.7162838-11768-267042818111772="` echo /root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772 `" ) && sleep 0' 8975 1727204066.72420: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 8975 1727204066.72548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.72600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.72680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.74697: stdout chunk (state=3): >>>ansible-tmp-1727204066.7162838-11768-267042818111772=/root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772 <<< 8975 1727204066.74886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.74890: stdout chunk (state=3): >>><<< 8975 1727204066.74893: stderr chunk (state=3): >>><<< 8975 1727204066.74935: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204066.7162838-11768-267042818111772=/root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204066.74976: variable 'ansible_module_compression' from source: unknown 8975 
1727204066.75040: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204066.75094: variable 'ansible_facts' from source: unknown 8975 1727204066.75178: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/AnsiballZ_command.py 8975 1727204066.75671: Sending initial data 8975 1727204066.75674: Sent initial data (155 bytes) 8975 1727204066.76081: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.76110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204066.76133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.76171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.76277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.77853: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204066.77918: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204066.77985: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpdas87kd9 /root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/AnsiballZ_command.py <<< 8975 1727204066.77997: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/AnsiballZ_command.py" <<< 8975 1727204066.78056: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpdas87kd9" to remote "/root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/AnsiballZ_command.py" <<< 8975 1727204066.78718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.78862: stderr chunk (state=3): >>><<< 8975 1727204066.78872: stdout chunk (state=3): >>><<< 8975 1727204066.78875: done transferring module to remote 8975 1727204066.78877: _low_level_execute_command(): starting 8975 1727204066.78880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/ /root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/AnsiballZ_command.py && sleep 0' 8975 1727204066.79474: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204066.79559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.79577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.79676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204066.81524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204066.81590: stderr chunk (state=3): >>><<< 8975 1727204066.81594: stdout chunk (state=3): >>><<< 8975 1727204066.81608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204066.81613: _low_level_execute_command(): starting 8975 1727204066.81616: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/AnsiballZ_command.py && sleep 0' 8975 1727204066.82126: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204066.82133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 8975 1727204066.82136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.82138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204066.82141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204066.82198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204066.82202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204066.82211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204066.82295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.03719: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:26.990143", "end": "2024-09-24 14:54:27.035610", "delta": "0:00:00.045467", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || 
rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204067.05433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 8975 1727204067.05481: stderr chunk (state=3): >>><<< 8975 1727204067.05485: stdout chunk (state=3): >>><<< 8975 1727204067.05500: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:26.990143", "end": "2024-09-24 14:54:27.035610", "delta": "0:00:00.045467", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
8975 1727204067.05542: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204067.05551: _low_level_execute_command(): starting 8975 1727204067.05554: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204066.7162838-11768-267042818111772/ > /dev/null 2>&1 && sleep 0' 8975 1727204067.06043: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.06047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.06050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.06052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204067.06055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.06111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.06115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.06117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.06196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.14320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.14381: stderr chunk (state=3): >>><<< 8975 1727204067.14385: stdout chunk (state=3): >>><<< 8975 1727204067.14398: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204067.14405: handler run complete 8975 1727204067.14424: Evaluated conditional (False): False 8975 1727204067.14440: attempt loop complete, returning result 8975 1727204067.14444: _execute() done 8975 1727204067.14446: dumping result to json 8975 1727204067.14449: done dumping result, returning 8975 1727204067.14458: done running TaskExecutor() for managed-node2/TASK: Remove test interfaces [127b8e07-fff9-9356-306d-0000000000c6] 8975 1727204067.14464: sending task result for task 127b8e07-fff9-9356-306d-0000000000c6 8975 1727204067.14576: done sending task result for task 127b8e07-fff9-9356-306d-0000000000c6 8975 1727204067.14579: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.045467", "end": "2024-09-24 14:54:27.035610", "rc": 0, "start": "2024-09-24 14:54:26.990143" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 8975 1727204067.14648: no more pending results, returning what we have 8975 1727204067.14652: results queue empty 8975 1727204067.14653: checking for any_errors_fatal 8975 1727204067.14673: done checking for any_errors_fatal 8975 1727204067.14674: checking for max_fail_percentage 8975 1727204067.14676: done checking for max_fail_percentage 8975 1727204067.14677: checking to see if all hosts have failed and the running result is not ok 8975 1727204067.14678: done checking to see if all hosts have failed 8975 1727204067.14678: getting the remaining hosts for this loop 8975 1727204067.14680: done getting the remaining hosts for this loop 8975 1727204067.14684: getting the next task for host managed-node2 8975 1727204067.14698: done getting next task for host managed-node2 8975 1727204067.14701: ^ task is: TASK: Stop dnsmasq/radvd services 8975 1727204067.14705: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204067.14709: getting variables 8975 1727204067.14711: in VariableManager get_vars() 8975 1727204067.14752: Calling all_inventory to load vars for managed-node2 8975 1727204067.14755: Calling groups_inventory to load vars for managed-node2 8975 1727204067.14757: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204067.14771: Calling all_plugins_play to load vars for managed-node2 8975 1727204067.14773: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204067.14776: Calling groups_plugins_play to load vars for managed-node2 8975 1727204067.15868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204067.17050: done with get_vars() 8975 1727204067.17076: done getting variables 8975 1727204067.17125: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.517) 0:00:38.488 ***** 8975 1727204067.17154: entering _queue_task() for managed-node2/shell 8975 1727204067.17443: worker is 1 (out of 1 available) 8975 1727204067.17458: exiting _queue_task() for managed-node2/shell 8975 1727204067.17473: done queuing things up, now waiting for results queue to drain 8975 1727204067.17475: waiting for pending results... 
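[editor's note] For the task that just finished, the log shows the full remote lifecycle of a command module: `echo ~` to locate the home directory, a `umask 77 && mkdir` private temp directory, an SFTP upload of AnsiballZ_command.py, `chmod u+x`, execution with /usr/bin/python3.12, and finally `rm -f -r` of the temp directory. The sketch below is a minimal local analogue of that create/upload/run/clean-up ordering, not Ansible's implementation; the payload script is a stand-in for the real AnsiballZ module.

```python
#!/usr/bin/python3
"""Hedged local analogue of the temp-dir lifecycle visible in the log:
make a private temp directory, drop an executable payload into it,
run it, and remove the directory afterwards.

The real sequence happens over SSH/SFTP against the managed node; this
only mirrors the ordering and the 0700 permissions implied by
`umask 77 && mkdir`.
"""
import os
import shutil
import subprocess
import sys
import tempfile

PAYLOAD = "#!/usr/bin/python3\nprint('payload ran')\n"  # stand-in for AnsiballZ_command.py


def run_payload() -> int:
    tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")  # mkdtemp creates the dir with mode 0700
    try:
        script = os.path.join(tmpdir, "AnsiballZ_command.py")
        with open(script, "w", encoding="utf-8") as fh:
            fh.write(PAYLOAD)                       # analogue of the SFTP put
        os.chmod(script, 0o700)                     # analogue of `chmod u+x`
        proc = subprocess.run([sys.executable, script])
        return proc.returncode
    finally:
        shutil.rmtree(tmpdir, ignore_errors=True)   # analogue of `rm -f -r <tmpdir>`


if __name__ == "__main__":
    sys.exit(run_payload())
```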
8975 1727204067.17674: running TaskExecutor() for managed-node2/TASK: Stop dnsmasq/radvd services 8975 1727204067.17777: in run() - task 127b8e07-fff9-9356-306d-0000000000c7 8975 1727204067.17791: variable 'ansible_search_path' from source: unknown 8975 1727204067.17795: variable 'ansible_search_path' from source: unknown 8975 1727204067.17829: calling self._execute() 8975 1727204067.17914: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204067.17919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204067.17933: variable 'omit' from source: magic vars 8975 1727204067.18232: variable 'ansible_distribution_major_version' from source: facts 8975 1727204067.18245: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204067.18252: variable 'omit' from source: magic vars 8975 1727204067.18295: variable 'omit' from source: magic vars 8975 1727204067.18325: variable 'omit' from source: magic vars 8975 1727204067.18368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204067.18408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204067.18429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204067.18444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204067.18455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204067.18488: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204067.18492: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204067.18494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204067.18574: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204067.18579: Set connection var ansible_connection to ssh 8975 1727204067.18582: Set connection var ansible_shell_executable to /bin/sh 8975 1727204067.18590: Set connection var ansible_timeout to 10 8975 1727204067.18592: Set connection var ansible_shell_type to sh 8975 1727204067.18603: Set connection var ansible_pipelining to False 8975 1727204067.18621: variable 'ansible_shell_executable' from source: unknown 8975 1727204067.18624: variable 'ansible_connection' from source: unknown 8975 1727204067.18631: variable 'ansible_module_compression' from source: unknown 8975 1727204067.18633: variable 'ansible_shell_type' from source: unknown 8975 1727204067.18636: variable 'ansible_shell_executable' from source: unknown 8975 1727204067.18638: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204067.18641: variable 'ansible_pipelining' from source: unknown 8975 1727204067.18644: variable 'ansible_timeout' from source: unknown 8975 1727204067.18646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204067.18762: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204067.18774: variable 'omit' from source: magic vars 8975 1727204067.18779: starting attempt loop 8975 
1727204067.18782: running the handler 8975 1727204067.18791: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204067.18808: _low_level_execute_command(): starting 8975 1727204067.18816: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204067.19390: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.19394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.19398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204067.19401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.19449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.19452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.19534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.21208: stdout chunk (state=3): >>>/root <<< 8975 1727204067.21313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.21381: stderr chunk (state=3): >>><<< 8975 1727204067.21388: stdout chunk (state=3): >>><<< 8975 1727204067.21410: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 
1727204067.21421: _low_level_execute_command(): starting 8975 1727204067.21432: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812 `" && echo ansible-tmp-1727204067.2140913-11793-221776796505812="` echo /root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812 `" ) && sleep 0' 8975 1727204067.21941: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.21945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.21956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204067.21958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204067.21960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.22015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.22018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.22023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.22095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.24069: stdout chunk (state=3): >>>ansible-tmp-1727204067.2140913-11793-221776796505812=/root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812 <<< 8975 1727204067.24182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.24247: stderr chunk (state=3): >>><<< 8975 1727204067.24251: stdout chunk (state=3): >>><<< 8975 1727204067.24273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204067.2140913-11793-221776796505812=/root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204067.24301: variable 'ansible_module_compression' from source: unknown 8975 1727204067.24349: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204067.24388: variable 'ansible_facts' from source: unknown 8975 1727204067.24443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/AnsiballZ_command.py 8975 1727204067.24562: Sending initial data 8975 1727204067.24568: Sent initial data (155 bytes) 8975 1727204067.25070: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.25074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.25077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.25079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.25139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.25142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.25147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.25216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.26817: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204067.26881: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 8975 1727204067.26951: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpzdbd9_ib /root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/AnsiballZ_command.py <<< 8975 1727204067.26958: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/AnsiballZ_command.py" <<< 8975 1727204067.27018: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpzdbd9_ib" to remote "/root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/AnsiballZ_command.py" <<< 8975 1727204067.27025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/AnsiballZ_command.py" <<< 8975 1727204067.27672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.27750: stderr chunk (state=3): >>><<< 8975 1727204067.27753: stdout chunk (state=3): >>><<< 8975 1727204067.27776: done transferring module to remote 8975 1727204067.27787: _low_level_execute_command(): starting 8975 1727204067.27792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/ /root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/AnsiballZ_command.py && sleep 0' 8975 1727204067.28300: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204067.28304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 8975 1727204067.28306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204067.28313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.28371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.28378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.28380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.28448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.30352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.30370: stderr chunk (state=3): >>><<< 8975 1727204067.30384: stdout chunk (state=3): >>><<< 8975 1727204067.30408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204067.30419: _low_level_execute_command(): starting 8975 1727204067.30429: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/AnsiballZ_command.py && sleep 0' 8975 1727204067.31077: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204067.31094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204067.31107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.31130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204067.31150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204067.31162: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204067.31209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.31259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.31285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.31389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.50964: stdout chunk (state=3): >>> <<< 8975 1727204067.51021: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp 
dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:27.478450", "end": "2024-09-24 14:54:27.507759", "delta": "0:00:00.029309", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204067.52522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.52536: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 8975 1727204067.52644: stderr chunk (state=3): >>><<< 8975 1727204067.52654: stdout chunk (state=3): >>><<< 8975 1727204067.52687: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:27.478450", "end": "2024-09-24 14:54:27.507759", "delta": "0:00:00.029309", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204067.52746: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204067.52767: _low_level_execute_command(): starting 8975 1727204067.52777: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204067.2140913-11793-221776796505812/ > /dev/null 2>&1 && sleep 0' 8975 1727204067.53464: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204067.53483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204067.53496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.53557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.53620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.53641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.53671: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.53776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.55790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.55794: stdout chunk (state=3): >>><<< 8975 1727204067.55797: stderr chunk (state=3): >>><<< 8975 1727204067.56021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204067.56025: handler run complete 8975 1727204067.56027: Evaluated conditional (False): False 8975 1727204067.56029: attempt loop complete, returning result 8975 1727204067.56032: _execute() done 8975 1727204067.56035: dumping result to json 8975 1727204067.56037: done dumping result, returning 8975 1727204067.56040: done running TaskExecutor() for managed-node2/TASK: Stop dnsmasq/radvd services [127b8e07-fff9-9356-306d-0000000000c7] 8975 1727204067.56042: sending task result for task 127b8e07-fff9-9356-306d-0000000000c7 8975 1727204067.56142: done sending task result for task 127b8e07-fff9-9356-306d-0000000000c7 8975 1727204067.56247: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.029309", "end": "2024-09-24 14:54:27.507759", "rc": 0, "start": "2024-09-24 14:54:27.478450" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 8975 1727204067.56329: no more pending results, returning what we have 8975 1727204067.56334: results queue empty 8975 1727204067.56334: checking for any_errors_fatal 8975 1727204067.56345: done checking for any_errors_fatal 8975 1727204067.56345: checking for max_fail_percentage 8975 1727204067.56347: done checking for max_fail_percentage 8975 1727204067.56348: checking to see if all 
hosts have failed and the running result is not ok 8975 1727204067.56350: done checking to see if all hosts have failed 8975 1727204067.56351: getting the remaining hosts for this loop 8975 1727204067.56353: done getting the remaining hosts for this loop 8975 1727204067.56359: getting the next task for host managed-node2 8975 1727204067.56372: done getting next task for host managed-node2 8975 1727204067.56375: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 8975 1727204067.56377: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8975 1727204067.56382: getting variables 8975 1727204067.56384: in VariableManager get_vars() 8975 1727204067.56437: Calling all_inventory to load vars for managed-node2 8975 1727204067.56441: Calling groups_inventory to load vars for managed-node2 8975 1727204067.56443: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204067.56457: Calling all_plugins_play to load vars for managed-node2 8975 1727204067.56461: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204067.56651: Calling groups_plugins_play to load vars for managed-node2 8975 1727204067.57716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204067.59225: done with get_vars() 8975 1727204067.59267: done getting variables 8975 1727204067.59336: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:131 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.422) 0:00:38.910 ***** 8975 1727204067.59372: entering _queue_task() for managed-node2/command 8975 1727204067.59943: worker is 1 (out of 1 available) 8975 1727204067.59956: exiting _queue_task() for managed-node2/command 8975 1727204067.59970: done queuing things up, now waiting for results queue to drain 8975 1727204067.59972: waiting for pending results... 
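

For readers following the trace: the payload executed for the "Stop dnsmasq/radvd services" task above is visible in the logged module arguments (_raw_params with _uses_shell: true), so the task in the test playbook presumably looks roughly like the sketch below. This is a reconstruction from the log, not the playbook source; the module short name and the changed_when keyword are assumptions (changed_when is inferred from the final result reporting "changed": false even though the module itself returned changed: true).

- name: Stop dnsmasq/radvd services
  # shell action inferred from '_uses_shell': True in the logged module args
  ansible.builtin.shell: |
    set -uxo pipefail
    exec 1>&2
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
        # Stop radvd server
        service radvd stop
        iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
        for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
                firewall-cmd --remove-service "$service"
            fi
        done
    fi
  changed_when: false  # assumption: inferred from the ok (changed=false) result shown above
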
8975 1727204067.60060: running TaskExecutor() for managed-node2/TASK: Restore the /etc/resolv.conf for initscript 8975 1727204067.60157: in run() - task 127b8e07-fff9-9356-306d-0000000000c8 8975 1727204067.60171: variable 'ansible_search_path' from source: unknown 8975 1727204067.60205: calling self._execute() 8975 1727204067.60293: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204067.60298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204067.60311: variable 'omit' from source: magic vars 8975 1727204067.60631: variable 'ansible_distribution_major_version' from source: facts 8975 1727204067.60643: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204067.60735: variable 'network_provider' from source: set_fact 8975 1727204067.60742: Evaluated conditional (network_provider == "initscripts"): False 8975 1727204067.60745: when evaluation is False, skipping this task 8975 1727204067.60748: _execute() done 8975 1727204067.60750: dumping result to json 8975 1727204067.60753: done dumping result, returning 8975 1727204067.60761: done running TaskExecutor() for managed-node2/TASK: Restore the /etc/resolv.conf for initscript [127b8e07-fff9-9356-306d-0000000000c8] 8975 1727204067.60771: sending task result for task 127b8e07-fff9-9356-306d-0000000000c8 8975 1727204067.60873: done sending task result for task 127b8e07-fff9-9356-306d-0000000000c8 8975 1727204067.60877: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 8975 1727204067.60928: no more pending results, returning what we have 8975 1727204067.60932: results queue empty 8975 1727204067.60933: checking for any_errors_fatal 8975 1727204067.60946: done checking for any_errors_fatal 8975 1727204067.60946: checking for max_fail_percentage 8975 1727204067.60948: done checking for max_fail_percentage 8975 1727204067.60949: checking to see if all hosts have failed and the running result is not ok 8975 1727204067.60950: done checking to see if all hosts have failed 8975 1727204067.60951: getting the remaining hosts for this loop 8975 1727204067.60953: done getting the remaining hosts for this loop 8975 1727204067.60957: getting the next task for host managed-node2 8975 1727204067.60969: done getting next task for host managed-node2 8975 1727204067.60972: ^ task is: TASK: Verify network state restored to default 8975 1727204067.60975: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204067.60979: getting variables 8975 1727204067.60981: in VariableManager get_vars() 8975 1727204067.61027: Calling all_inventory to load vars for managed-node2 8975 1727204067.61030: Calling groups_inventory to load vars for managed-node2 8975 1727204067.61032: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204067.61043: Calling all_plugins_play to load vars for managed-node2 8975 1727204067.61046: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204067.61048: Calling groups_plugins_play to load vars for managed-node2 8975 1727204067.62302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204067.64457: done with get_vars() 8975 1727204067.64495: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:136 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.052) 0:00:38.962 ***** 8975 1727204067.64598: entering _queue_task() for managed-node2/include_tasks 8975 1727204067.64988: worker is 1 (out of 1 available) 8975 1727204067.65003: exiting _queue_task() for managed-node2/include_tasks 8975 1727204067.65016: done queuing things up, now waiting for results queue to drain 8975 1727204067.65018: waiting for pending results... 8975 1727204067.65398: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 8975 1727204067.65490: in run() - task 127b8e07-fff9-9356-306d-0000000000c9 8975 1727204067.65517: variable 'ansible_search_path' from source: unknown 8975 1727204067.65563: calling self._execute() 8975 1727204067.65716: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204067.65720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204067.65726: variable 'omit' from source: magic vars 8975 1727204067.66151: variable 'ansible_distribution_major_version' from source: facts 8975 1727204067.66257: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204067.66261: _execute() done 8975 1727204067.66266: dumping result to json 8975 1727204067.66270: done dumping result, returning 8975 1727204067.66273: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [127b8e07-fff9-9356-306d-0000000000c9] 8975 1727204067.66275: sending task result for task 127b8e07-fff9-9356-306d-0000000000c9 8975 1727204067.66363: done sending task result for task 127b8e07-fff9-9356-306d-0000000000c9 8975 1727204067.66368: WORKER PROCESS EXITING 8975 1727204067.66400: no more pending results, returning what we have 8975 1727204067.66405: in VariableManager get_vars() 8975 1727204067.66463: Calling all_inventory to load vars for managed-node2 8975 1727204067.66468: Calling groups_inventory to load vars for managed-node2 8975 1727204067.66471: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204067.66489: Calling all_plugins_play to load vars for managed-node2 8975 1727204067.66492: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204067.66496: Calling groups_plugins_play to load vars for managed-node2 8975 1727204067.68623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204067.70743: done with get_vars() 8975 1727204067.70778: variable 'ansible_search_path' 
from source: unknown 8975 1727204067.70795: we have included files to process 8975 1727204067.70797: generating all_blocks data 8975 1727204067.70802: done generating all_blocks data 8975 1727204067.70808: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8975 1727204067.70810: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8975 1727204067.70813: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 8975 1727204067.71269: done processing included file 8975 1727204067.71272: iterating over new_blocks loaded from include file 8975 1727204067.71274: in VariableManager get_vars() 8975 1727204067.71296: done with get_vars() 8975 1727204067.71298: filtering new block on tags 8975 1727204067.71336: done filtering new block on tags 8975 1727204067.71339: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 8975 1727204067.71344: extending task lists for all hosts with included blocks 8975 1727204067.72784: done extending task lists 8975 1727204067.72786: done processing included files 8975 1727204067.72787: results queue empty 8975 1727204067.72788: checking for any_errors_fatal 8975 1727204067.72791: done checking for any_errors_fatal 8975 1727204067.72792: checking for max_fail_percentage 8975 1727204067.72794: done checking for max_fail_percentage 8975 1727204067.72795: checking to see if all hosts have failed and the running result is not ok 8975 1727204067.72796: done checking to see if all hosts have failed 8975 1727204067.72796: getting the remaining hosts for this loop 8975 1727204067.72798: done getting the remaining hosts for this loop 8975 1727204067.72801: getting the next task for host managed-node2 8975 1727204067.72805: done getting next task for host managed-node2 8975 1727204067.72808: ^ task is: TASK: Check routes and DNS 8975 1727204067.72812: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204067.72815: getting variables 8975 1727204067.72816: in VariableManager get_vars() 8975 1727204067.72836: Calling all_inventory to load vars for managed-node2 8975 1727204067.72839: Calling groups_inventory to load vars for managed-node2 8975 1727204067.72841: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204067.72849: Calling all_plugins_play to load vars for managed-node2 8975 1727204067.72851: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204067.72854: Calling groups_plugins_play to load vars for managed-node2 8975 1727204067.74408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204067.76639: done with get_vars() 8975 1727204067.76668: done getting variables 8975 1727204067.76720: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.121) 0:00:39.084 ***** 8975 1727204067.76752: entering _queue_task() for managed-node2/shell 8975 1727204067.77141: worker is 1 (out of 1 available) 8975 1727204067.77156: exiting _queue_task() for managed-node2/shell 8975 1727204067.77172: done queuing things up, now waiting for results queue to drain 8975 1727204067.77173: waiting for pending results... 8975 1727204067.77587: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 8975 1727204067.77637: in run() - task 127b8e07-fff9-9356-306d-000000000570 8975 1727204067.77660: variable 'ansible_search_path' from source: unknown 8975 1727204067.77670: variable 'ansible_search_path' from source: unknown 8975 1727204067.77721: calling self._execute() 8975 1727204067.77839: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204067.77852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204067.77873: variable 'omit' from source: magic vars 8975 1727204067.78290: variable 'ansible_distribution_major_version' from source: facts 8975 1727204067.78310: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204067.78326: variable 'omit' from source: magic vars 8975 1727204067.78398: variable 'omit' from source: magic vars 8975 1727204067.78419: variable 'omit' from source: magic vars 8975 1727204067.78457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8975 1727204067.78493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8975 1727204067.78512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8975 1727204067.78529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204067.78539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8975 1727204067.78571: variable 'inventory_hostname' from source: host vars for 'managed-node2' 8975 1727204067.78574: variable 
'ansible_host' from source: host vars for 'managed-node2' 8975 1727204067.78578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204067.78673: Set connection var ansible_module_compression to ZIP_DEFLATED 8975 1727204067.78676: Set connection var ansible_connection to ssh 8975 1727204067.78682: Set connection var ansible_shell_executable to /bin/sh 8975 1727204067.78689: Set connection var ansible_timeout to 10 8975 1727204067.78692: Set connection var ansible_shell_type to sh 8975 1727204067.78703: Set connection var ansible_pipelining to False 8975 1727204067.78722: variable 'ansible_shell_executable' from source: unknown 8975 1727204067.78725: variable 'ansible_connection' from source: unknown 8975 1727204067.78735: variable 'ansible_module_compression' from source: unknown 8975 1727204067.78738: variable 'ansible_shell_type' from source: unknown 8975 1727204067.78741: variable 'ansible_shell_executable' from source: unknown 8975 1727204067.78744: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204067.78746: variable 'ansible_pipelining' from source: unknown 8975 1727204067.78749: variable 'ansible_timeout' from source: unknown 8975 1727204067.78751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204067.78868: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204067.78881: variable 'omit' from source: magic vars 8975 1727204067.78885: starting attempt loop 8975 1727204067.78887: running the handler 8975 1727204067.78896: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8975 1727204067.78953: _low_level_execute_command(): starting 8975 1727204067.78957: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8975 1727204067.79503: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.79508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204067.79512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 8975 1727204067.79514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.79573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.79578: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.79583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.79659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.81455: stdout chunk (state=3): >>>/root <<< 8975 1727204067.81660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.81664: stdout chunk (state=3): >>><<< 8975 1727204067.81670: stderr chunk (state=3): >>><<< 8975 1727204067.81698: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204067.81720: _low_level_execute_command(): starting 8975 1727204067.81769: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597 `" && echo ansible-tmp-1727204067.8170605-11811-248226315304597="` echo /root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597 `" ) && sleep 0' 8975 1727204067.82278: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.82290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.82293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.82296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.82345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.82352: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.82425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.84440: stdout chunk (state=3): >>>ansible-tmp-1727204067.8170605-11811-248226315304597=/root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597 <<< 8975 1727204067.84517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.84586: stderr chunk (state=3): >>><<< 8975 1727204067.84589: stdout chunk (state=3): >>><<< 8975 1727204067.84602: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204067.8170605-11811-248226315304597=/root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204067.84670: variable 'ansible_module_compression' from source: unknown 8975 1727204067.84683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8975j75zz8at/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8975 1727204067.84721: variable 'ansible_facts' from source: unknown 8975 1727204067.84782: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/AnsiballZ_command.py 8975 1727204067.84911: Sending initial data 8975 1727204067.84915: Sent initial data (155 bytes) 8975 1727204067.85370: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.85399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 8975 1727204067.85402: stderr chunk (state=3): >>>debug2: match not found <<< 8975 1727204067.85405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.85407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.85410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.85463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.85469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.85546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.87156: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 8975 1727204067.87220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 8975 1727204067.87295: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8975j75zz8at/tmpflwq1m72 /root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/AnsiballZ_command.py <<< 8975 1727204067.87298: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/AnsiballZ_command.py" <<< 8975 1727204067.87358: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-8975j75zz8at/tmpflwq1m72" to remote "/root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/AnsiballZ_command.py" <<< 8975 1727204067.87362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/AnsiballZ_command.py" <<< 8975 1727204067.88008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.88087: stderr chunk (state=3): >>><<< 8975 1727204067.88090: stdout chunk (state=3): >>><<< 8975 1727204067.88109: done transferring module to remote 8975 1727204067.88120: _low_level_execute_command(): starting 8975 1727204067.88125: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/ /root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/AnsiballZ_command.py && sleep 0' 8975 1727204067.88622: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204067.88625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 8975 1727204067.88628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.88635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.88672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.88686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.88762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204067.90608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204067.90672: stderr chunk (state=3): >>><<< 8975 1727204067.90676: stdout chunk (state=3): >>><<< 8975 1727204067.90690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204067.90694: _low_level_execute_command(): starting 8975 1727204067.90699: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/AnsiballZ_command.py && sleep 0' 8975 1727204067.91199: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 8975 1727204067.91203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.91206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204067.91208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 8975 1727204067.91211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204067.91269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204067.91273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204067.91278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204067.91354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204068.08873: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3447sec preferred_lft 3447sec\n inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:28.076943", "end": "2024-09-24 14:54:28.086091", "delta": "0:00:00.009148", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8975 1727204068.10510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 8975 1727204068.10541: stderr chunk (state=3): >>><<< 8975 1727204068.10544: stdout chunk (state=3): >>><<< 8975 1727204068.10570: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3447sec preferred_lft 3447sec\n inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:28.076943", "end": "2024-09-24 14:54:28.086091", "delta": "0:00:00.009148", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 8975 1727204068.10721: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 8975 1727204068.10725: _low_level_execute_command(): starting 8975 1727204068.10728: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204067.8170605-11811-248226315304597/ > /dev/null 2>&1 && sleep 0' 8975 1727204068.11369: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 8975 1727204068.11384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 8975 1727204068.11451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 8975 1727204068.11511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 8975 1727204068.11540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 8975 1727204068.11587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 8975 1727204068.11658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 8975 1727204068.13875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 8975 1727204068.13880: stderr chunk (state=3): >>><<< 8975 1727204068.13883: stdout 
chunk (state=3): >>><<< 8975 1727204068.13886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 8975 1727204068.13888: handler run complete 8975 1727204068.13890: Evaluated conditional (False): False 8975 1727204068.13893: attempt loop complete, returning result 8975 1727204068.13895: _execute() done 8975 1727204068.13897: dumping result to json 8975 1727204068.13899: done dumping result, returning 8975 1727204068.13902: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [127b8e07-fff9-9356-306d-000000000570] 8975 1727204068.13904: sending task result for task 127b8e07-fff9-9356-306d-000000000570 8975 1727204068.14275: done sending task result for task 127b8e07-fff9-9356-306d-000000000570 8975 1727204068.14280: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009148", "end": "2024-09-24 14:54:28.086091", "rc": 0, "start": "2024-09-24 14:54:28.076943" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3447sec preferred_lft 3447sec inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. 
# # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 8975 1727204068.14381: no more pending results, returning what we have 8975 1727204068.14385: results queue empty 8975 1727204068.14386: checking for any_errors_fatal 8975 1727204068.14388: done checking for any_errors_fatal 8975 1727204068.14389: checking for max_fail_percentage 8975 1727204068.14395: done checking for max_fail_percentage 8975 1727204068.14396: checking to see if all hosts have failed and the running result is not ok 8975 1727204068.14397: done checking to see if all hosts have failed 8975 1727204068.14398: getting the remaining hosts for this loop 8975 1727204068.14401: done getting the remaining hosts for this loop 8975 1727204068.14405: getting the next task for host managed-node2 8975 1727204068.14415: done getting next task for host managed-node2 8975 1727204068.14418: ^ task is: TASK: Verify DNS and network connectivity 8975 1727204068.14427: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8975 1727204068.14432: getting variables 8975 1727204068.14434: in VariableManager get_vars() 8975 1727204068.14524: Calling all_inventory to load vars for managed-node2 8975 1727204068.14527: Calling groups_inventory to load vars for managed-node2 8975 1727204068.14529: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204068.14543: Calling all_plugins_play to load vars for managed-node2 8975 1727204068.14546: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204068.14549: Calling groups_plugins_play to load vars for managed-node2 8975 1727204068.16480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204068.18834: done with get_vars() 8975 1727204068.18885: done getting variables 8975 1727204068.18963: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.422) 0:00:39.506 ***** 8975 1727204068.19001: entering _queue_task() for managed-node2/shell 8975 1727204068.19588: worker is 1 (out of 1 available) 8975 1727204068.19600: exiting _queue_task() for managed-node2/shell 8975 1727204068.19611: done queuing things up, now waiting for results queue to drain 8975 1727204068.19613: waiting for pending results... 8975 1727204068.19790: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 8975 1727204068.19951: in run() - task 127b8e07-fff9-9356-306d-000000000571 8975 1727204068.19956: variable 'ansible_search_path' from source: unknown 8975 1727204068.19959: variable 'ansible_search_path' from source: unknown 8975 1727204068.19983: calling self._execute() 8975 1727204068.20112: variable 'ansible_host' from source: host vars for 'managed-node2' 8975 1727204068.20124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 8975 1727204068.20141: variable 'omit' from source: magic vars 8975 1727204068.20568: variable 'ansible_distribution_major_version' from source: facts 8975 1727204068.20587: Evaluated conditional (ansible_distribution_major_version != '6'): True 8975 1727204068.20754: variable 'ansible_facts' from source: unknown 8975 1727204068.21802: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 8975 1727204068.21848: when evaluation is False, skipping this task 8975 1727204068.21853: _execute() done 8975 1727204068.21855: dumping result to json 8975 1727204068.21858: done dumping result, returning 8975 1727204068.21860: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [127b8e07-fff9-9356-306d-000000000571] 8975 1727204068.21862: sending task result for task 127b8e07-fff9-9356-306d-000000000571 8975 1727204068.22128: done sending task result for task 127b8e07-fff9-9356-306d-000000000571 8975 1727204068.22132: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 8975 1727204068.22189: no more pending results, returning what we have 
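Editor's note: the escaped _raw_params string in the "Check routes and DNS" module invocation above is hard to read in that form. Unescaped, the task ran the following shell script on managed-node2 (contents taken from the log; the inline comments are added here and were not part of the original command):

    # Script executed by the "Check routes and DNS" task, unescaped from the
    # module_args shown above; comments added for readability.
    set -euo pipefail
    echo IP
    ip a                          # interfaces and addresses
    echo IP ROUTE
    ip route                      # IPv4 routing table
    echo IP -6 ROUTE
    ip -6 route                   # IPv6 routing table
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf      # resolver configuration
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.*   || :   # list any resolv.conf variants; never fail
    fi

As the result above shows, the script exited 0 in roughly 9 ms and reported a single dynamic IPv4 address on eth0 (10.31.47.73/22), a default route via 10.31.44.1, and the systemd-resolved stub resolver at 127.0.0.53.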
8975 1727204068.22192: results queue empty 8975 1727204068.22193: checking for any_errors_fatal 8975 1727204068.22209: done checking for any_errors_fatal 8975 1727204068.22210: checking for max_fail_percentage 8975 1727204068.22211: done checking for max_fail_percentage 8975 1727204068.22213: checking to see if all hosts have failed and the running result is not ok 8975 1727204068.22214: done checking to see if all hosts have failed 8975 1727204068.22215: getting the remaining hosts for this loop 8975 1727204068.22217: done getting the remaining hosts for this loop 8975 1727204068.22221: getting the next task for host managed-node2 8975 1727204068.22349: done getting next task for host managed-node2 8975 1727204068.22353: ^ task is: TASK: meta (flush_handlers) 8975 1727204068.22355: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204068.22361: getting variables 8975 1727204068.22362: in VariableManager get_vars() 8975 1727204068.22415: Calling all_inventory to load vars for managed-node2 8975 1727204068.22419: Calling groups_inventory to load vars for managed-node2 8975 1727204068.22422: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204068.22440: Calling all_plugins_play to load vars for managed-node2 8975 1727204068.22444: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204068.22449: Calling groups_plugins_play to load vars for managed-node2 8975 1727204068.24600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204068.26818: done with get_vars() 8975 1727204068.26853: done getting variables 8975 1727204068.26995: in VariableManager get_vars() 8975 1727204068.27014: Calling all_inventory to load vars for managed-node2 8975 1727204068.27017: Calling groups_inventory to load vars for managed-node2 8975 1727204068.27019: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204068.27025: Calling all_plugins_play to load vars for managed-node2 8975 1727204068.27030: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204068.27034: Calling groups_plugins_play to load vars for managed-node2 8975 1727204068.28657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204068.31152: done with get_vars() 8975 1727204068.31190: done queuing things up, now waiting for results queue to drain 8975 1727204068.31193: results queue empty 8975 1727204068.31194: checking for any_errors_fatal 8975 1727204068.31197: done checking for any_errors_fatal 8975 1727204068.31198: checking for max_fail_percentage 8975 1727204068.31199: done checking for max_fail_percentage 8975 1727204068.31200: checking to see if all hosts have failed and the running result is not ok 8975 1727204068.31201: done checking to see if all hosts have failed 8975 1727204068.31202: getting the remaining hosts for this loop 8975 1727204068.31203: done getting the remaining hosts for this loop 8975 1727204068.31206: getting the next task for host managed-node2 8975 1727204068.31211: done getting next task for host managed-node2 8975 1727204068.31212: ^ task is: TASK: meta (flush_handlers) 8975 1727204068.31214: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8975 1727204068.31217: getting variables 8975 1727204068.31218: in VariableManager get_vars() 8975 1727204068.31236: Calling all_inventory to load vars for managed-node2 8975 1727204068.31239: Calling groups_inventory to load vars for managed-node2 8975 1727204068.31241: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204068.31248: Calling all_plugins_play to load vars for managed-node2 8975 1727204068.31251: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204068.31254: Calling groups_plugins_play to load vars for managed-node2 8975 1727204068.32816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204068.35014: done with get_vars() 8975 1727204068.35052: done getting variables 8975 1727204068.35112: in VariableManager get_vars() 8975 1727204068.35131: Calling all_inventory to load vars for managed-node2 8975 1727204068.35134: Calling groups_inventory to load vars for managed-node2 8975 1727204068.35136: Calling all_plugins_inventory to load vars for managed-node2 8975 1727204068.35141: Calling all_plugins_play to load vars for managed-node2 8975 1727204068.35143: Calling groups_plugins_inventory to load vars for managed-node2 8975 1727204068.35146: Calling groups_plugins_play to load vars for managed-node2 8975 1727204068.36793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8975 1727204068.38978: done with get_vars() 8975 1727204068.39022: done queuing things up, now waiting for results queue to drain 8975 1727204068.39025: results queue empty 8975 1727204068.39026: checking for any_errors_fatal 8975 1727204068.39030: done checking for any_errors_fatal 8975 1727204068.39030: checking for max_fail_percentage 8975 1727204068.39032: done checking for max_fail_percentage 8975 1727204068.39032: checking to see if all hosts have failed and the running result is not ok 8975 1727204068.39033: done checking to see if all hosts have failed 8975 1727204068.39034: getting the remaining hosts for this loop 8975 1727204068.39035: done getting the remaining hosts for this loop 8975 1727204068.39046: getting the next task for host managed-node2 8975 1727204068.39049: done getting next task for host managed-node2 8975 1727204068.39050: ^ task is: None 8975 1727204068.39052: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
8975 1727204068.39053: done queuing things up, now waiting for results queue to drain
8975 1727204068.39054: results queue empty
8975 1727204068.39055: checking for any_errors_fatal
8975 1727204068.39056: done checking for any_errors_fatal
8975 1727204068.39057: checking for max_fail_percentage
8975 1727204068.39058: done checking for max_fail_percentage
8975 1727204068.39058: checking to see if all hosts have failed and the running result is not ok
8975 1727204068.39059: done checking to see if all hosts have failed
8975 1727204068.39061: getting the next task for host managed-node2
8975 1727204068.39064: done getting next task for host managed-node2
8975 1727204068.39066: ^ task is: None
8975 1727204068.39068: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2              : ok=75   changed=3    unreachable=0    failed=0    skipped=61   rescued=0    ignored=0

Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.201)       0:00:39.708 *****
===============================================================================
Install dnsmasq --------------------------------------------------------- 3.42s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.87s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.48s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 2.00s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Check which packages are installed --- 1.71s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.48s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
Install pgrep, sysctl --------------------------------------------------- 1.44s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.27s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.19s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.07s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.96s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.75s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.72s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 0.63s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Delete the device 'deprecated-bond' ------------------------------------- 0.58s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.58s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Get NM profile info ----------------------------------------------------- 0.53s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Remove test interfaces -------------------------------------------------- 0.52s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
** TEST check IPv6 ------------------------------------------------------ 0.51s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:87
8975 1727204068.39207: RUNNING CLEANUP
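Editor's note: the per-task timing summary above matches the output format of a task-profiling callback such as ansible.posix.profile_tasks. A minimal re-run sketch with that callback enabled is shown below; the original command line is not visible in this log, so the inventory path is a placeholder and the -vv flag only mirrors the verbosity level (2) recorded in the module invocations above.

    # Hypothetical re-run sketch, not the original invocation.
    # ANSIBLE_CALLBACKS_ENABLED and ANSIBLE_COLLECTIONS_PATH are standard Ansible
    # environment variables; the playbook path is taken from the recap above,
    # the inventory path is a placeholder.
    ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks \
    ANSIBLE_COLLECTIONS_PATH=/tmp/collections-MVC \
    ansible-playbook -vv \
      -i <inventory.yml> \
      /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml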