10202 1727204037.69395: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-MVC
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
10202 1727204037.69741: Added group all to inventory
10202 1727204037.69742: Added group ungrouped to inventory
10202 1727204037.69745: Group all now contains ungrouped
10202 1727204037.69748: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml
10202 1727204037.90848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
10202 1727204037.90899: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
10202 1727204037.90917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
10202 1727204037.90979: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
10202 1727204037.91072: Loaded config def from plugin (inventory/script)
10202 1727204037.91075: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
10202 1727204037.91129: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
10202 1727204037.91233: Loaded config def from plugin (inventory/yaml)
10202 1727204037.91236: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
10202 1727204037.91337: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
10202 1727204037.91888: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
10202 1727204037.91891: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
10202 1727204037.91895: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
10202 1727204037.91901: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
10202 1727204037.91906: Loading data from /tmp/network-jrl/inventory-0Xx.yml
10202 1727204037.91986: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto
10202 1727204037.92076: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
10202 1727204037.92125: Loading data from /tmp/network-jrl/inventory-0Xx.yml
10202 1727204037.92229: group all already in inventory
10202 1727204037.92236: set inventory_file for managed-node1
10202 1727204037.92241: set inventory_dir for managed-node1
10202 1727204037.92242: Added host managed-node1 to inventory
10202 1727204037.92244: Added host managed-node1 to group all
10202 1727204037.92245: set ansible_host for managed-node1
10202 1727204037.92246: set ansible_ssh_extra_args for managed-node1
10202 1727204037.92249: set inventory_file for managed-node2
10202 1727204037.92251: set inventory_dir for managed-node2
10202 1727204037.92252: Added host managed-node2 to inventory
10202 1727204037.92254: Added host managed-node2 to group all
10202 1727204037.92255: set ansible_host for managed-node2
10202 1727204037.92255: set ansible_ssh_extra_args for managed-node2
10202 1727204037.92258: set inventory_file for managed-node3
10202 1727204037.92260: set inventory_dir for managed-node3
10202 1727204037.92261: Added host managed-node3 to inventory
10202 1727204037.92262: Added host managed-node3 to group all
10202 1727204037.92263: set ansible_host for managed-node3
10202 1727204037.92263: set ansible_ssh_extra_args for managed-node3
10202 1727204037.92268: Reconcile groups and hosts in inventory.
10202 1727204037.92272: Group ungrouped now contains managed-node1
10202 1727204037.92273: Group ungrouped now contains managed-node2
10202 1727204037.92275: Group ungrouped now contains managed-node3
10202 1727204037.92378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
10202 1727204037.92524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
10202 1727204037.92580: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
10202 1727204037.92610: Loaded config def from plugin (vars/host_group_vars)
10202 1727204037.92613: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
10202 1727204037.92620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
10202 1727204037.92629: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
10202 1727204037.92677: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
10202 1727204037.93027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204037.93124: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
10202 1727204037.93167: Loaded config def from plugin (connection/local)
10202 1727204037.93170: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
10202 1727204037.93985: Loaded config def from plugin (connection/paramiko_ssh)
10202 1727204037.93990: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
10202 1727204037.96435: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
10202 1727204037.96485: Loaded config def from plugin (connection/psrp)
10202 1727204037.96490: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
10202 1727204037.97475: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
10202 1727204037.97525: Loaded config def from plugin (connection/ssh)
10202 1727204037.97530: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
10202 1727204037.99969: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
10202 1727204038.00153: Loaded config def from plugin (connection/winrm)
10202 1727204038.00157: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
10202 1727204038.00193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
10202 1727204038.00266: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
10202 1727204038.00346: Loaded config def from plugin (shell/cmd)
10202 1727204038.00348: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
10202 1727204038.00381: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
10202 1727204038.00455: Loaded config def from plugin (shell/powershell)
10202 1727204038.00457: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
10202 1727204038.00518: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
10202 1727204038.00707: Loaded config def from plugin (shell/sh)
10202 1727204038.00710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
10202 1727204038.00749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
10202 1727204038.00890: Loaded config def from plugin (become/runas)
10202 1727204038.00892: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
10202 1727204038.01094: Loaded config def from plugin (become/su)
10202 1727204038.01097: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
10202 1727204038.01271: Loaded config def from plugin (become/sudo)
10202 1727204038.01273: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
10202 1727204038.01314: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
10202 1727204038.01683: in VariableManager get_vars()
10202 1727204038.01707: done with get_vars()
10202 1727204038.01853: trying /usr/local/lib/python3.12/site-packages/ansible/modules
10202 1727204038.07800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
10202 1727204038.08046: in VariableManager get_vars()
10202 1727204038.08052: done with get_vars()
10202 1727204038.08055: variable 'playbook_dir' from source: magic vars
10202 1727204038.08057: variable 'ansible_playbook_python' from source: magic vars
10202 1727204038.08058: variable 'ansible_config_file' from source: magic vars
10202 1727204038.08058: variable 'groups' from source: magic vars
10202 1727204038.08059: variable 'omit' from source: magic vars
10202 1727204038.08060: variable 'ansible_version' from source: magic vars
10202 1727204038.08061: variable 'ansible_check_mode' from source: magic vars
10202 1727204038.08061: variable 'ansible_diff_mode' from source: magic vars
10202 1727204038.08062: variable 'ansible_forks' from source: magic vars
10202 1727204038.08063: variable 'ansible_inventory_sources' from source: magic vars
10202 1727204038.08064: variable 'ansible_skip_tags' from source: magic vars
10202 1727204038.08065: variable 'ansible_limit' from source: magic vars
10202 1727204038.08067: variable 'ansible_run_tags' from source: magic vars
10202 1727204038.08068: variable 'ansible_verbosity' from source: magic vars
10202 1727204038.08153: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml
10202 1727204038.08935: in VariableManager get_vars()
10202 1727204038.08954: done with get_vars()
10202 1727204038.08964: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
10202 1727204038.10136: in VariableManager get_vars()
10202 1727204038.10154: done with get_vars()
10202 1727204038.10163: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
10202 1727204038.10294: in VariableManager get_vars()
10202 1727204038.10312: done with get_vars()
10202 1727204038.10479: in VariableManager get_vars()
10202 1727204038.10494: done with get_vars()
10202 1727204038.10504: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
10202 1727204038.10582: in VariableManager get_vars()
10202 1727204038.10599: done with get_vars()
10202 1727204038.10913: in VariableManager get_vars()
10202 1727204038.10928: done with get_vars()
10202 1727204038.10933: variable 'omit' from source: magic vars
10202 1727204038.10952: variable 'omit' from source: magic vars
10202 1727204038.10994: in VariableManager get_vars()
10202 1727204038.11007: done with get_vars()
10202 1727204038.11059: in VariableManager get_vars()
10202 1727204038.11075: done with get_vars()
10202 1727204038.11115: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
10202 1727204038.11386: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
10202 1727204038.11580: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
10202 1727204038.12557: in VariableManager get_vars()
10202 1727204038.12583: done with get_vars()
10202 1727204038.13120: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
10202 1727204038.13352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
10202 1727204038.15643: in VariableManager get_vars()
10202 1727204038.15673: done with get_vars()
10202 1727204038.15687: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
10202 1727204038.16119: in VariableManager get_vars()
10202 1727204038.16141: done with get_vars()
10202 1727204038.16520: in VariableManager get_vars()
10202 1727204038.16540: done with get_vars()
10202 1727204038.17623: in VariableManager get_vars()
10202 1727204038.17646: done with get_vars()
10202 1727204038.17652: variable 'omit' from source: magic vars
10202 1727204038.17698: variable 'omit' from source: magic vars
10202 1727204038.17743: in VariableManager get_vars()
10202 1727204038.17761: done with get_vars()
10202 1727204038.17790: in VariableManager get_vars()
10202 1727204038.17808: done with get_vars()
10202 1727204038.17845: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
10202 1727204038.18214: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
10202 1727204038.24136: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
10202 1727204038.25130: in VariableManager get_vars()
10202 1727204038.25163: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
10202 1727204038.29360: in VariableManager get_vars()
10202 1727204038.29389: done with get_vars()
10202 1727204038.29400: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
10202 1727204038.29922: in VariableManager get_vars()
10202 1727204038.29949: done with get_vars()
10202 1727204038.30021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
10202 1727204038.30040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
10202 1727204038.30304: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
10202 1727204038.30494: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
10202 1727204038.30497: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
10202 1727204038.30537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
10202 1727204038.30570: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
10202 1727204038.30769: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
10202 1727204038.30840: Loaded config def from plugin (callback/default)
10202 1727204038.30843: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
10202 1727204038.32149: Loaded config def from plugin (callback/junit)
10202 1727204038.32153: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
10202 1727204038.32211: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
10202 1727204038.32289: Loaded config def from plugin (callback/minimal)
10202 1727204038.32292: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
10202 1727204038.32339: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
10202 1727204038.32409: Loaded config def from plugin (callback/tree)
10202 1727204038.32412: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
10202 1727204038.32564: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
10202 1727204038.32568: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_nm.yml ****************************************************
2 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
10202 1727204038.32600: in VariableManager get_vars()
10202 1727204038.32617: done with get_vars()
10202 1727204038.32626: in VariableManager get_vars()
10202 1727204038.32637: done with get_vars()
10202 1727204038.32641: variable 'omit' from source: magic vars
10202 1727204038.32687: in VariableManager get_vars()
10202 1727204038.32703: done with get_vars()
10202 1727204038.32730: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] *************
10202 1727204038.33425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
10202 1727204038.33511: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
10202 1727204038.33575: getting the remaining hosts for this loop
10202 1727204038.33581: done getting the remaining hosts for this loop
10202 1727204038.33585: getting the next task for host managed-node3
10202 1727204038.33589: done getting next task for host managed-node3
10202 1727204038.33591: ^ task is: TASK: Gathering Facts
10202 1727204038.33593: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204038.33596: getting variables
10202 1727204038.33597: in VariableManager get_vars()
10202 1727204038.33610: Calling all_inventory to load vars for managed-node3
10202 1727204038.33613: Calling groups_inventory to load vars for managed-node3
10202 1727204038.33616: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204038.33635: Calling all_plugins_play to load vars for managed-node3
10202 1727204038.33648: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204038.33652: Calling groups_plugins_play to load vars for managed-node3
10202 1727204038.33698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204038.33767: done with get_vars()
10202 1727204038.33775: done getting variables
10202 1727204038.33857: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.014) 0:00:00.014 *****
10202 1727204038.33884: entering _queue_task() for managed-node3/gather_facts
10202 1727204038.33886: Creating lock for gather_facts
10202 1727204038.34397: worker is 1 (out of 1 available)
10202 1727204038.34410: exiting _queue_task() for managed-node3/gather_facts
10202 1727204038.34425: done queuing things up, now waiting for results queue to drain
10202 1727204038.34427: waiting for pending results...
10202 1727204038.34793: running TaskExecutor() for managed-node3/TASK: Gathering Facts
10202 1727204038.34799: in run() - task 127b8e07-fff9-0b04-2570-0000000000cc
10202 1727204038.34803: variable 'ansible_search_path' from source: unknown
10202 1727204038.34807: calling self._execute()
10202 1727204038.34864: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204038.34888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204038.34905: variable 'omit' from source: magic vars
10202 1727204038.35033: variable 'omit' from source: magic vars
10202 1727204038.35071: variable 'omit' from source: magic vars
10202 1727204038.35123: variable 'omit' from source: magic vars
10202 1727204038.35184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
10202 1727204038.35238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
10202 1727204038.35269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
10202 1727204038.35294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204038.35312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204038.35362: variable 'inventory_hostname' from source: host vars for 'managed-node3'
10202 1727204038.35373: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204038.35381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204038.35505: Set connection var ansible_shell_type to sh
10202 1727204038.35518: Set connection var ansible_module_compression to ZIP_DEFLATED
10202 1727204038.35528: Set connection var ansible_connection to ssh
10202 1727204038.35547: Set connection var ansible_shell_executable to /bin/sh
10202 1727204038.35560: Set connection var ansible_pipelining to False
10202 1727204038.35649: Set connection var ansible_timeout to 10
10202 1727204038.35653: variable 'ansible_shell_executable' from source: unknown
10202 1727204038.35655: variable 'ansible_connection' from source: unknown
10202 1727204038.35657: variable 'ansible_module_compression' from source: unknown
10202 1727204038.35659: variable 'ansible_shell_type' from source: unknown
10202 1727204038.35661: variable 'ansible_shell_executable' from source: unknown
10202 1727204038.35663: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204038.35667: variable 'ansible_pipelining' from source: unknown
10202 1727204038.35669: variable 'ansible_timeout' from source: unknown
10202 1727204038.35671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204038.35868: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
10202 1727204038.35887: variable 'omit' from source: magic vars
10202 1727204038.35899: starting attempt loop
10202 1727204038.35906: running the handler
10202 1727204038.35925: variable 'ansible_facts' from source: unknown
10202 1727204038.35948: _low_level_execute_command(): starting
10202 1727204038.35960: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
10202 1727204038.36777: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
10202 1727204038.36798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
10202 1727204038.36816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
10202 1727204038.36837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
10202 1727204038.36856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<<
10202 1727204038.36878: stderr chunk (state=3): >>>debug2: match not found <<<
10202 1727204038.36980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204038.37006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204038.37125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204038.38940: stdout chunk (state=3): >>>/root <<<
10202 1727204038.39045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204038.39125: stderr chunk (state=3): >>><<<
10202 1727204038.39129: stdout chunk (state=3): >>><<<
10202 1727204038.39153: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204038.39166: _low_level_execute_command(): starting
10202 1727204038.39174: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389 `" && echo ansible-tmp-1727204038.3915303-10383-221981963256389="` echo /root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389 `" ) && sleep 0'
10202 1727204038.39674: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
10202 1727204038.39682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204038.39702: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204038.39739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
10202 1727204038.39743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204038.39754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204038.39831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204038.42016: stdout chunk (state=3): >>>ansible-tmp-1727204038.3915303-10383-221981963256389=/root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389 <<<
10202 1727204038.42128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204038.42199: stderr chunk (state=3): >>><<<
10202 1727204038.42202: stdout chunk (state=3): >>><<<
10202 1727204038.42214: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204038.3915303-10383-221981963256389=/root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204038.42298: variable 'ansible_module_compression' from source: unknown
10202 1727204038.42302: ANSIBALLZ: Using generic lock for ansible.legacy.setup
10202 1727204038.42305: ANSIBALLZ: Acquiring lock
10202 1727204038.42308: ANSIBALLZ: Lock acquired: 140045305564624
10202 1727204038.42310: ANSIBALLZ: Creating module
10202 1727204038.66816: ANSIBALLZ: Writing module into payload
10202 1727204038.66919: ANSIBALLZ: Writing module
10202 1727204038.66945: ANSIBALLZ: Renaming module
10202 1727204038.66951: ANSIBALLZ: Done creating module
10202 1727204038.66986: variable 'ansible_facts' from source: unknown
10202 1727204038.66990: variable 'inventory_hostname' from source: host vars for 'managed-node3'
10202 1727204038.66999: _low_level_execute_command(): starting
10202 1727204038.67010: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
10202 1727204038.67810: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204038.67813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204038.67816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204038.67819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204038.68086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204038.69755: stdout chunk (state=3): >>>PLATFORM <<< 10202 1727204038.69830: stdout chunk (state=3): >>>Linux <<< 10202 1727204038.69860: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 10202 1727204038.69862: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 10202 1727204038.70004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204038.70073: stderr chunk (state=3): >>><<< 10202 1727204038.70077: stdout chunk (state=3): >>><<< 10202 1727204038.70092: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204038.70103 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 10202 1727204038.70146: _low_level_execute_command(): starting 10202 1727204038.70150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 10202 1727204038.70237: Sending initial data 10202 1727204038.70240: Sent initial data (1181 bytes) 10202 1727204038.70976: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204038.70980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204038.70989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204038.71127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204038.74910: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 10202 1727204038.75390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204038.75767: stderr chunk (state=3): >>><<< 10202 1727204038.75770: stdout chunk (state=3): >>><<< 10202 1727204038.75773: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 
(Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204038.75775: variable 'ansible_facts' from source: unknown 10202 1727204038.75777: variable 'ansible_facts' from source: unknown 10202 1727204038.75779: variable 'ansible_module_compression' from source: unknown 10202 1727204038.75781: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10202 1727204038.75830: variable 'ansible_facts' from source: unknown 10202 1727204038.76022: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/AnsiballZ_setup.py 10202 1727204038.76298: Sending initial data 10202 1727204038.76301: Sent initial data (154 bytes) 10202 1727204038.76844: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204038.76962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204038.77045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204038.77178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204038.79249: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204038.79363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10202 1727204038.79439: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp37ud5cw1 /root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/AnsiballZ_setup.py <<< 10202 1727204038.79443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/AnsiballZ_setup.py" <<< 10202 1727204038.79522: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp37ud5cw1" to remote "/root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/AnsiballZ_setup.py" <<< 10202 1727204038.81542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204038.81697: stderr chunk (state=3): >>><<< 10202 1727204038.81701: stdout chunk (state=3): >>><<< 10202 1727204038.81703: done transferring module to remote 10202 1727204038.81706: _low_level_execute_command(): starting 10202 1727204038.81708: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/ 
/root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/AnsiballZ_setup.py && sleep 0' 10202 1727204038.82337: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204038.82354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204038.82373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204038.82427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204038.82445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204038.82534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204038.82557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204038.82577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204038.82682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204038.84951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204038.84956: stdout chunk (state=3): >>><<< 10202 1727204038.84958: stderr chunk (state=3): >>><<< 10202 1727204038.84960: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204038.84963: _low_level_execute_command(): starting 10202 1727204038.84966: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/AnsiballZ_setup.py && sleep 0' 10202 1727204038.85589: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204038.85620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204038.85638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204038.85670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204038.85784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204038.88316: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 10202 1727204038.88347: stdout chunk (state=3): >>>import _imp # builtin <<< 10202 1727204038.88387: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 10202 1727204038.88477: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 10202 1727204038.88502: stdout chunk (state=3): >>>import 'posix' # <<< 10202 1727204038.88530: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 10202 1727204038.88555: stdout chunk (state=3): >>>import 'time' # <<< 10202 1727204038.88568: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 10202 1727204038.88634: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204038.88663: stdout chunk (state=3): >>>import '_codecs' # <<< 10202 1727204038.88679: stdout 
chunk (state=3): >>>import 'codecs' # <<< 10202 1727204038.88740: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10202 1727204038.88759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42cc0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42c8fb30> <<< 10202 1727204038.88784: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 10202 1727204038.88831: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42cc2ab0> import '_signal' # <<< 10202 1727204038.88835: stdout chunk (state=3): >>>import '_abc' # <<< 10202 1727204038.88869: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 10202 1727204038.88902: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 10202 1727204038.89003: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10202 1727204038.89037: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 10202 1727204038.89110: stdout chunk (state=3): >>>import 'os' # <<< 10202 1727204038.89114: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 10202 1727204038.89116: stdout chunk (state=3): >>>Processing user site-packages <<< 10202 1727204038.89121: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 10202 1727204038.89134: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 10202 1727204038.89173: stdout chunk 
(state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 10202 1727204038.89199: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42a71190> <<< 10202 1727204038.89284: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 10202 1727204038.89302: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42a72090> <<< 10202 1727204038.89322: stdout chunk (state=3): >>>import 'site' # <<< 10202 1727204038.89364: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 10202 1727204038.89786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 10202 1727204038.89812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204038.89836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10202 1727204038.89902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 10202 1727204038.89905: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 10202 1727204038.89932: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 10202 1727204038.89960: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aafec0> <<< 10202 1727204038.89975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 10202 1727204038.89986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 10202 1727204038.90020: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aaff80> <<< 10202 1727204038.90042: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 10202 1727204038.90059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 10202 1727204038.90092: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10202 1727204038.90143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204038.90170: stdout chunk (state=3): >>>import 'itertools' # <<< 10202 1727204038.90216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ae7890> <<< 10202 1727204038.90222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 10202 1727204038.90249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ae7f20> import '_collections' # <<< 10202 1727204038.90308: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ac7b90> <<< 10202 1727204038.90319: stdout chunk (state=3): >>>import '_functools' # <<< 10202 1727204038.90348: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ac52b0> <<< 10202 1727204038.90543: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aad070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 10202 1727204038.90574: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 10202 1727204038.90604: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 10202 1727204038.90639: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b0b830> <<< 10202 1727204038.90659: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b0a450> <<< 10202 1727204038.90693: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ac6180> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b08cb0> <<< 10202 1727204038.90775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aac2f0> <<< 10202 1727204038.90800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10202 1727204038.90826: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import 
'_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b3cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3cbf0> <<< 10202 1727204038.90864: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204038.90869: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b3cfe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aaae10> <<< 10202 1727204038.90915: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204038.90977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 10202 1727204038.91017: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3d370> import 'importlib.machinery' # <<< 10202 1727204038.91025: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 10202 1727204038.91042: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3e570> <<< 10202 1727204038.91069: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 10202 
1727204038.91109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 10202 1727204038.91137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 10202 1727204038.91170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b587a0> <<< 10202 1727204038.91191: stdout chunk (state=3): >>>import 'errno' # <<< 10202 1727204038.91213: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204038.91252: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b59ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 10202 1727204038.91286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 10202 1727204038.91290: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b5ad80> <<< 10202 1727204038.91356: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b5b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b5a2d0> <<< 10202 1727204038.91359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 10202 1727204038.91386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 10202 1727204038.91422: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204038.91440: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b5bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b5b530> <<< 10202 1727204038.91495: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3e5d0> <<< 10202 1727204038.91506: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 10202 1727204038.91540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 10202 1727204038.91579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10202 1727204038.91586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10202 1727204038.91681: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42857d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204038.91685: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42880770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428804d0> <<< 10202 1727204038.91989: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d428807a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42880980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42855eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42882000> import 'weakref' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42880c80> <<< 10202 1727204038.92004: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3e750> <<< 10202 1727204038.92024: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10202 1727204038.92083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204038.92104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10202 1727204038.92150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10202 1727204038.92178: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428ae3c0> <<< 10202 1727204038.92236: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 10202 1727204038.92254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204038.92277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 10202 1727204038.92298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 10202 1727204038.92347: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428c6570> <<< 10202 1727204038.92374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 10202 1727204038.92412: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 10202 1727204038.92480: stdout chunk (state=3): >>>import 'ntpath' # <<< 10202 1727204038.92504: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428ff320> <<< 10202 1727204038.92529: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 10202 1727204038.92569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10202 1727204038.92590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10202 1727204038.92645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10202 1727204038.92733: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42925ac0> <<< 10202 1727204038.92910: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428ff440> <<< 10202 1727204038.92938: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428c7200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4273c4a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428c55b0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f5d42882f60> <<< 10202 1727204038.93094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 10202 1727204038.93116: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5d4273c770> <<< 10202 1727204038.93303: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_b9a7hdst/ansible_ansible.legacy.setup_payload.zip' <<< 10202 1727204038.93317: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204038.93473: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204038.93488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10202 1727204038.93545: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 10202 1727204038.93638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 10202 1727204038.93673: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427a6210> import '_typing' # <<< 10202 1727204038.93884: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4277d100> <<< 10202 1727204038.93905: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4277c2f0> # zipimport: zlib available <<< 10202 1727204038.93934: stdout chunk (state=3): >>>import 'ansible' # <<< 10202 1727204038.93959: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204038.93992: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 10202 1727204038.94006: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204038.95630: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204038.97060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4277fec0> <<< 10202 1727204038.97068: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204038.97129: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 10202 1727204038.97133: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 10202 1727204038.97153: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d427d9b20> <<< 10202 1727204038.97186: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427d98e0> <<< 10202 1727204038.97232: stdout chunk 
(state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427d9250> <<< 10202 1727204038.97273: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10202 1727204038.97303: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427d9670> <<< 10202 1727204038.97306: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428c53a0> <<< 10202 1727204038.97343: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d427da900> <<< 10202 1727204038.97387: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d427dab40> <<< 10202 1727204038.97400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10202 1727204038.97463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 10202 1727204038.97466: stdout chunk (state=3): >>>import '_locale' # <<< 10202 1727204038.97537: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427daff0> import 'pwd' # <<< 10202 1727204038.97551: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 10202 1727204038.97592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10202 1727204038.97627: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4263cce0> <<< 10202 1727204038.97659: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4263e900> <<< 10202 1727204038.97680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 10202 1727204038.97695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10202 1727204038.97754: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4263f2c0> <<< 10202 1727204038.97757: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10202 1727204038.97797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 10202 1727204038.97840: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42640470> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10202 1727204038.97883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10202 1727204038.97900: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10202 1727204038.97969: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42642f60> <<< 10202 1727204038.98014: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42643080> <<< 10202 1727204038.98039: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42641220> <<< 10202 1727204038.98053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10202 1727204038.98084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10202 1727204038.98108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 10202 1727204038.98129: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10202 1727204038.98167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 10202 1727204038.98219: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 10202 1727204038.98222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5d42646f00> import '_tokenize' # <<< 10202 1727204038.98311: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d426459d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42645730> <<< 10202 1727204038.98333: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10202 1727204038.98425: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42647ad0> <<< 10202 1727204038.98449: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42641730> <<< 10202 1727204038.98485: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4268b050> <<< 10202 1727204038.98520: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4268b200> <<< 10202 1727204038.98567: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10202 1727204038.98571: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10202 
1727204038.98596: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10202 1727204038.98628: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204038.98655: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42690e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42690bc0> <<< 10202 1727204038.98669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10202 1727204038.98797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10202 1727204038.98858: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42693290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42691400> <<< 10202 1727204038.98880: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 10202 1727204038.98936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204038.98969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 10202 
1727204038.98982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 10202 1727204038.99032: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4269aa50> <<< 10202 1727204038.99167: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42693440> <<< 10202 1727204038.99248: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269b890> <<< 10202 1727204038.99287: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269bcb0> <<< 10202 1727204038.99342: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204038.99358: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269bbc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4268b500> <<< 10202 1727204038.99385: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc 
matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 10202 1727204038.99403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 10202 1727204038.99429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 10202 1727204038.99457: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204038.99488: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269f4d0> <<< 10202 1727204038.99678: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204038.99706: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d426a0620> <<< 10202 1727204038.99709: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4269dc40> <<< 10202 1727204038.99743: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269eff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4269d8e0> # zipimport: 
zlib available <<< 10202 1727204038.99759: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 10202 1727204038.99778: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204038.99882: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.00001: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204039.00005: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 10202 1727204039.00042: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 10202 1727204039.00061: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.00188: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.00319: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.00992: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.01643: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 10202 1727204039.01695: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 10202 1727204039.01699: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204039.01769: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42528770> <<< 10202 1727204039.01861: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10202 1727204039.01892: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42529550> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4269c9b0> <<< 10202 1727204039.01952: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 10202 1727204039.01970: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.01995: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.02020: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 10202 1727204039.02174: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.02388: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42529520> <<< 10202 1727204039.02413: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.02949: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.03461: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.03544: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.03628: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 10202 1727204039.03662: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.03669: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.03715: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 10202 1727204039.03797: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.03898: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10202 1727204039.03917: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204039.03942: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 10202 1727204039.03970: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.03985: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.04025: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10202 1727204039.04046: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.04307: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.04578: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 10202 1727204039.04645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 10202 1727204039.04669: stdout chunk (state=3): >>>import '_ast' # <<< 10202 1727204039.04749: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4252a3f0> <<< 10202 1727204039.04752: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.04839: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.04926: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 10202 1727204039.04962: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10202 1727204039.05062: stdout chunk (state=3): >>># 
extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204039.05191: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d425321b0> <<< 10202 1727204039.05262: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42532ae0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4252aed0> <<< 10202 1727204039.05288: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.05330: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.05373: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 10202 1727204039.05388: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.05429: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.05477: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.05543: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.05627: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10202 1727204039.05668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204039.05773: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42531910> <<< 10202 1727204039.05819: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42532c90> <<< 10202 1727204039.06105: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 10202 1727204039.06116: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10202 1727204039.06147: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 10202 1727204039.06150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10202 1727204039.06215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 10202 1727204039.06240: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 10202 1727204039.06250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10202 1727204039.06327: stdout chunk (state=3): >>>import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5d425caba0> <<< 10202 1727204039.06367: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4253c860> <<< 10202 1727204039.06458: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4253a9f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425317f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10202 1727204039.06480: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.06504: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.06539: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 10202 1727204039.06602: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 10202 1727204039.06638: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 10202 1727204039.06654: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.06715: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.06786: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.06797: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.06822: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.06867: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.06914: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.07026: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.07057: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 10202 1727204039.07087: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.07179: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 10202 1727204039.07190: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.07261: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 10202 1727204039.07447: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.07651: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.07688: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.07753: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204039.07772: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 10202 1727204039.07802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 10202 1727204039.07833: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 10202 1727204039.07868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 10202 1727204039.07876: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425cd910> <<< 10202 1727204039.07889: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 10202 1727204039.07906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 10202 1727204039.07928: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 10202 1727204039.07999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 10202 1727204039.08003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 10202 1727204039.08027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41b48200> <<< 10202 1727204039.08074: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204039.08085: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41b48560> <<< 10202 1727204039.08141: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425ad280> <<< 10202 1727204039.08173: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425ac320> <<< 10202 1727204039.08204: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425ccaa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425cca40> <<< 10202 1727204039.08221: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 10202 1727204039.08317: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 10202 1727204039.08338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 10202 1727204039.08369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 10202 1727204039.08404: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204039.08438: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41b4b560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41b4ae10> <<< 10202 1727204039.08464: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41b4aff0> <<< 10202 1727204039.08482: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41b4a270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 10202 1727204039.08630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 10202 1727204039.08635: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5d41b4b620> <<< 10202 1727204039.08657: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 10202 1727204039.08681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 10202 1727204039.08716: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41bb6120> <<< 10202 1727204039.08746: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41bb4140> <<< 10202 1727204039.08784: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425cd100> import 'ansible.module_utils.facts.timeout' # <<< 10202 1727204039.08815: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 10202 1727204039.08850: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 10202 1727204039.08853: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.08921: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.08987: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 10202 1727204039.09003: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09053: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09122: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 10202 1727204039.09137: stdout 
chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 10202 1727204039.09157: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09184: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 10202 1727204039.09243: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09278: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09339: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 10202 1727204039.09389: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09435: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 10202 1727204039.09446: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09501: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09563: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09625: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.09695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 10202 1727204039.09698: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 10202 1727204039.09720: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.10293: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.10806: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 10202 1727204039.10939: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11225: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204039.11229: stdout chunk (state=3): >>>import 
'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 10202 1727204039.11255: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 10202 1727204039.11279: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11282: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 10202 1727204039.11320: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11331: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 10202 1727204039.11373: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11458: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 10202 1727204039.11579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 10202 1727204039.11594: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41bb5e80> <<< 10202 1727204039.11615: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 10202 1727204039.11641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 10202 1727204039.11777: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41bb6ed0> import 'ansible.module_utils.facts.system.local' 
# <<< 10202 1727204039.11796: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11860: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.11945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 10202 1727204039.11948: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.12062: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.12146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 10202 1727204039.12180: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.12227: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.12309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 10202 1727204039.12330: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.12358: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.12414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 10202 1727204039.12470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 10202 1727204039.12554: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204039.12623: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41be2480> <<< 10202 1727204039.12850: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41bcf110> import 'ansible.module_utils.facts.system.python' # <<< 10202 1727204039.12865: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.12927: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.12995: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 10202 1727204039.12998: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13085: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13179: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13304: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13488: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 10202 1727204039.13491: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13527: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13586: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 10202 1727204039.13613: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13629: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13676: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 10202 1727204039.13691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 10202 1727204039.13713: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204039.13764: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204039.13794: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41bfde80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41be0380> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 10202 1727204039.13812: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13850: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.13887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 10202 1727204039.13903: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14083: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 10202 1727204039.14254: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14362: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14472: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14522: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14570: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 10202 1727204039.14578: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 10202 1727204039.14598: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14619: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14785: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.14953: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 10202 1727204039.14977: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.15093: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.15234: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 10202 1727204039.15246: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 10202 1727204039.15268: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.15317: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.15991: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.16607: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 10202 1727204039.16635: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.16739: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.16858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 10202 1727204039.16971: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.17077: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 10202 1727204039.17099: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.17253: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.17473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 10202 1727204039.17476: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 10202 1727204039.17493: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.17527: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.17582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 10202 1727204039.17694: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.17802: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18037: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18284: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 10202 1727204039.18288: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18331: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18371: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 10202 1727204039.18397: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 10202 1727204039.18446: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18513: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 10202 1727204039.18621: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18648: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 10202 1727204039.18666: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18724: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 10202 1727204039.18858: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.18935: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 10202 1727204039.18938: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.19249: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.19561: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 10202 1727204039.19626: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.19705: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 10202 1727204039.19709: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.19734: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.19781: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 10202 1727204039.19820: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.19862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 10202 1727204039.19868: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.19895: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.19939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 10202 1727204039.20038: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.20134: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 10202 1727204039.20164: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 10202 1727204039.20416: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204039.20420: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.20502: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.20572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 10202 1727204039.20594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 10202 1727204039.20652: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 10202 1727204039.20702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 10202 1727204039.20727: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.20951: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.21284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # <<< 10202 1727204039.21304: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.21345: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.21404: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 10202 1727204039.21407: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.21499: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.21598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 10202 1727204039.21620: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.21705: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.21799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 10202 1727204039.21887: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204039.22091: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 10202 1727204039.22116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 
10202 1727204039.22136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 10202 1727204039.22185: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d419a7110> <<< 10202 1727204039.22200: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419a4aa0> <<< 10202 1727204039.22242: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419a56a0> <<< 10202 1727204039.35676: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 10202 1727204039.35681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419ef0b0> <<< 10202 1727204039.35684: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 10202 1727204039.35732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419ecbc0> <<< 10202 1727204039.35788: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204039.35822: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 10202 1727204039.35844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 10202 1727204039.35881: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419ee420> <<< 10202 1727204039.35893: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419eddf0> <<< 10202 1727204039.36196: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 10202 1727204039.59733: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.41259765625, "5m": 0.3759765625, "15m": 0.1826171875}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, 
"ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3084, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 632, "free": 3084}, "nocache": {"free": 3492, "used": 224}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_de<<< 10202 1727204039.59767: stdout chunk (state=3): >>>vices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 376, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251335225344, "block_size": 4096, "block_total": 64479564, "block_available": 61361139, "block_used": 3118425, "inode_total": 16384000, "inode_available": 16301595, "inode_used": 82405, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", 
"prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "59", "epoch": "1727204039", "epoch_int": "1727204039", "date": "2024-09-24", "time": "14:53:59", "iso8601_micro": "2024-09-24T18:53:59.593361Z", "iso8601": "2024-09-24T18:53:59Z", "iso8601_basic": "20240924T145359593361", "iso8601_basic_short": "20240924T145359", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10202 1727204039.60517: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 10202 1727204039.60563: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases <<< 
10202 1727204039.60569: stdout chunk (state=3): >>># cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 10202 1727204039.60631: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] 
removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 10202 1727204039.60639: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize <<< 10202 1727204039.60718: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing 
systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 10202 1727204039.60731: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 10202 1727204039.60735: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing 
multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq <<< 10202 1727204039.60797: stdout chunk (state=3): >>># cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing 
_ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # 
cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips <<< 10202 
1727204039.60804: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr <<< 10202 1727204039.60832: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # 
destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 10202 1727204039.61246: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 10202 1727204039.61294: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 10202 1727204039.61326: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 10202 1727204039.61340: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 10202 1727204039.61393: stdout chunk (state=3): >>># destroy ntpath # destroy importlib <<< 10202 1727204039.61434: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 10202 1727204039.61461: stdout chunk (state=3): >>># destroy 
_locale # destroy locale # destroy select <<< 10202 1727204039.61477: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 10202 1727204039.61542: stdout chunk (state=3): >>># destroy _hashlib <<< 10202 1727204039.61556: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 10202 1727204039.61594: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 10202 1727204039.61653: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 10202 1727204039.61656: stdout chunk (state=3): >>># destroy _pickle <<< 10202 1727204039.61670: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 10202 1727204039.61701: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 10202 1727204039.61716: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 10202 1727204039.61733: stdout chunk (state=3): >>># destroy _ssl <<< 10202 1727204039.61772: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 10202 1727204039.61788: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json <<< 10202 1727204039.61811: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 10202 1727204039.61846: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy 
tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 10202 1727204039.61849: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 10202 1727204039.61930: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 10202 1727204039.61956: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 10202 1727204039.62023: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 10202 1727204039.62027: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 10202 1727204039.62077: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 10202 1727204039.62128: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] 
wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc<<< 10202 1727204039.62133: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections <<< 10202 1727204039.62175: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 10202 1727204039.62178: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 10202 1727204039.62197: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10202 1727204039.62413: stdout chunk (state=3): >>># destroy sys.monitoring <<< 10202 1727204039.62432: stdout chunk (state=3): >>># destroy _socket <<< 10202 1727204039.62474: stdout chunk (state=3): >>># destroy _collections <<< 10202 1727204039.62477: stdout chunk (state=3): >>># destroy platform <<< 10202 1727204039.62509: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 10202 1727204039.62547: stdout chunk (state=3): >>># destroy _typing <<< 10202 
1727204039.62569: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 10202 1727204039.62612: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 10202 1727204039.62615: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 10202 1727204039.62727: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 10202 1727204039.62761: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 10202 1727204039.62817: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 10202 1727204039.62843: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 10202 1727204039.62854: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 10202 1727204039.63473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204039.63477: stdout chunk (state=3): >>><<< 10202 1727204039.63480: stderr chunk (state=3): >>><<< 10202 1727204039.63775: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42cc0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42c8fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42cc2ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42a71190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42a72090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aafec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aaff80> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ae7890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ae7f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ac7b90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ac52b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aad070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b0b830> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b0a450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42ac6180> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b08cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aac2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b3cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3cbf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b3cfe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42aaae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3e570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b587a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b59ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b5ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b5b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b5a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42b5bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b5b530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3e5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42857d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42880770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428804d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d428807a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42880980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42855eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42882000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42880c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42b3e750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428ae3c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428c6570> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428ff320> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42925ac0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428ff440> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428c7200> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4273c4a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428c55b0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42882f60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5d4273c770> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_b9a7hdst/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427a6210> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4277d100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4277c2f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4277fec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d427d9b20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427d98e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427d9250> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427d9670> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d428c53a0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f5d427da900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d427dab40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d427daff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4263cce0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4263e900> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4263f2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42640470> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42642f60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42643080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42641220> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42646f00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d426459d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42645730> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f5d42647ad0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42641730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4268b050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4268b200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42690e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42690bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42693290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42691400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4269aa50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42693440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269b890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269bcb0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269bbc0> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5d4268b500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269f4d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d426a0620> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4269dc40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d4269eff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4269d8e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42528770> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42529550> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4269c9b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42529520> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4252a3f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d425321b0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42532ae0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4252aed0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d42531910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d42532c90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425caba0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4253c860> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f5d4253a9f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425317f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425cd910> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41b48200> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41b48560> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425ad280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425ac320> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425ccaa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425cca40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41b4b560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41b4ae10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41b4aff0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41b4a270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41b4b620> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41bb6120> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41bb4140> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d425cd100> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41bb5e80> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41bb6ed0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41be2480> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41bcf110> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d41bfde80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d41be0380> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5d419a7110> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419a4aa0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419a56a0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419ef0b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419ecbc0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419ee420> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5d419eddf0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", 
"ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.41259765625, "5m": 0.3759765625, "15m": 0.1826171875}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3084, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 632, "free": 3084}, "nocache": {"free": 3492, "used": 224}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, 
"removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 376, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251335225344, "block_size": 4096, "block_total": 64479564, "block_available": 61361139, "block_used": 3118425, "inode_total": 16384000, "inode_available": 16301595, "inode_used": 82405, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": 
"255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "59", "epoch": "1727204039", "epoch_int": "1727204039", "date": "2024-09-24", "time": "14:53:59", "iso8601_micro": "2024-09-24T18:53:59.593361Z", "iso8601": "2024-09-24T18:53:59Z", "iso8601_basic": "20240924T145359593361", "iso8601_basic_short": "20240924T145359", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": 
"||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] 
removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy 
# destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # 
cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing 
# cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # 
cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # 
destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing 
ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy 
multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math 
# cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache …
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # 
destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves 
# destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping 
_thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. 
See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 10202 1727204039.66173: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204039.66218: _low_level_execute_command(): starting 10202 1727204039.66230: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204038.3915303-10383-221981963256389/ > /dev/null 2>&1 && sleep 0' 10202 1727204039.66961: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204039.67023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204039.67103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204039.67152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204039.67156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204039.67275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204039.69432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204039.69436: stdout chunk (state=3): >>><<< 10202 1727204039.69439: stderr chunk (state=3): >>><<< 10202 1727204039.69467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 10202 1727204039.69483: handler run complete 10202 1727204039.69610: variable 'ansible_facts' from source: unknown 10202 1727204039.69727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204039.70046: variable 'ansible_facts' from source: unknown 10202 1727204039.70146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204039.70383: attempt loop complete, returning result 10202 1727204039.70387: _execute() done 10202 1727204039.70389: dumping result to json 10202 1727204039.70392: done dumping result, returning 10202 1727204039.70394: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-0b04-2570-0000000000cc] 10202 1727204039.70396: sending task result for task 127b8e07-fff9-0b04-2570-0000000000cc 10202 1727204039.70823: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000cc 10202 1727204039.70826: WORKER PROCESS EXITING ok: [managed-node3] 10202 1727204039.71564: no more pending results, returning what we have 10202 1727204039.71571: results queue empty 10202 1727204039.71572: checking for any_errors_fatal 10202 1727204039.71573: done checking for any_errors_fatal 10202 1727204039.71574: checking for max_fail_percentage 10202 1727204039.71576: done checking for max_fail_percentage 10202 1727204039.71577: checking to see if all hosts have failed and the running result is not ok 10202 1727204039.71578: done checking to see if all hosts have failed 10202 1727204039.71579: getting the remaining hosts for this loop 10202 1727204039.71580: done getting the remaining hosts for this loop 10202 1727204039.71584: getting the next task for host managed-node3 10202 1727204039.71591: done getting next task for host managed-node3 10202 1727204039.71593: ^ task is: TASK: meta (flush_handlers) 10202 1727204039.71596: ^ state is: HOST STATE: block=1, 
task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204039.71600: getting variables 10202 1727204039.71601: in VariableManager get_vars() 10202 1727204039.71627: Calling all_inventory to load vars for managed-node3 10202 1727204039.71630: Calling groups_inventory to load vars for managed-node3 10202 1727204039.71633: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204039.71644: Calling all_plugins_play to load vars for managed-node3 10202 1727204039.71647: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204039.71650: Calling groups_plugins_play to load vars for managed-node3 10202 1727204039.71833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204039.72021: done with get_vars() 10202 1727204039.72035: done getting variables 10202 1727204039.72111: in VariableManager get_vars() 10202 1727204039.72123: Calling all_inventory to load vars for managed-node3 10202 1727204039.72125: Calling groups_inventory to load vars for managed-node3 10202 1727204039.72128: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204039.72134: Calling all_plugins_play to load vars for managed-node3 10202 1727204039.72136: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204039.72140: Calling groups_plugins_play to load vars for managed-node3 10202 1727204039.72299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204039.72490: done with get_vars() 10202 1727204039.72505: done queuing things up, now waiting for results queue to drain 10202 1727204039.72507: results queue empty 10202 1727204039.72508: checking for any_errors_fatal 10202 
1727204039.72511: done checking for any_errors_fatal 10202 1727204039.72512: checking for max_fail_percentage 10202 1727204039.72513: done checking for max_fail_percentage 10202 1727204039.72514: checking to see if all hosts have failed and the running result is not ok 10202 1727204039.72522: done checking to see if all hosts have failed 10202 1727204039.72522: getting the remaining hosts for this loop 10202 1727204039.72523: done getting the remaining hosts for this loop 10202 1727204039.72527: getting the next task for host managed-node3 10202 1727204039.72532: done getting next task for host managed-node3 10202 1727204039.72534: ^ task is: TASK: Include the task 'el_repo_setup.yml' 10202 1727204039.72536: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204039.72538: getting variables 10202 1727204039.72539: in VariableManager get_vars() 10202 1727204039.72549: Calling all_inventory to load vars for managed-node3 10202 1727204039.72551: Calling groups_inventory to load vars for managed-node3 10202 1727204039.72553: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204039.72559: Calling all_plugins_play to load vars for managed-node3 10202 1727204039.72561: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204039.72564: Calling groups_plugins_play to load vars for managed-node3 10202 1727204039.72711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204039.72905: done with get_vars() 10202 1727204039.72915: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11 Tuesday 24 September 2024 14:53:59 -0400 (0:00:01.391) 0:00:01.405 ***** 10202 1727204039.73000: entering _queue_task() for managed-node3/include_tasks 10202 1727204039.73002: Creating lock for include_tasks 10202 1727204039.73347: worker is 1 (out of 1 available) 10202 1727204039.73362: exiting _queue_task() for managed-node3/include_tasks 10202 1727204039.73381: done queuing things up, now waiting for results queue to drain 10202 1727204039.73383: waiting for pending results... 
10202 1727204039.73595: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 10202 1727204039.73707: in run() - task 127b8e07-fff9-0b04-2570-000000000006 10202 1727204039.73733: variable 'ansible_search_path' from source: unknown 10202 1727204039.73779: calling self._execute() 10202 1727204039.73870: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204039.73884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204039.73920: variable 'omit' from source: magic vars 10202 1727204039.74071: _execute() done 10202 1727204039.74074: dumping result to json 10202 1727204039.74077: done dumping result, returning 10202 1727204039.74080: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-0b04-2570-000000000006] 10202 1727204039.74088: sending task result for task 127b8e07-fff9-0b04-2570-000000000006 10202 1727204039.74317: done sending task result for task 127b8e07-fff9-0b04-2570-000000000006 10202 1727204039.74320: WORKER PROCESS EXITING 10202 1727204039.74380: no more pending results, returning what we have 10202 1727204039.74386: in VariableManager get_vars() 10202 1727204039.74422: Calling all_inventory to load vars for managed-node3 10202 1727204039.74425: Calling groups_inventory to load vars for managed-node3 10202 1727204039.74429: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204039.74442: Calling all_plugins_play to load vars for managed-node3 10202 1727204039.74445: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204039.74449: Calling groups_plugins_play to load vars for managed-node3 10202 1727204039.74650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204039.74836: done with get_vars() 10202 1727204039.74845: variable 'ansible_search_path' from source: unknown 10202 1727204039.74861: we have 
included files to process 10202 1727204039.74862: generating all_blocks data 10202 1727204039.74864: done generating all_blocks data 10202 1727204039.74867: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10202 1727204039.74868: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10202 1727204039.74871: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10202 1727204039.75744: in VariableManager get_vars() 10202 1727204039.75762: done with get_vars() 10202 1727204039.75778: done processing included file 10202 1727204039.75780: iterating over new_blocks loaded from include file 10202 1727204039.75782: in VariableManager get_vars() 10202 1727204039.75792: done with get_vars() 10202 1727204039.75794: filtering new block on tags 10202 1727204039.75810: done filtering new block on tags 10202 1727204039.75813: in VariableManager get_vars() 10202 1727204039.75824: done with get_vars() 10202 1727204039.75826: filtering new block on tags 10202 1727204039.75844: done filtering new block on tags 10202 1727204039.75847: in VariableManager get_vars() 10202 1727204039.76094: done with get_vars() 10202 1727204039.76097: filtering new block on tags 10202 1727204039.76113: done filtering new block on tags 10202 1727204039.76115: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 10202 1727204039.76122: extending task lists for all hosts with included blocks 10202 1727204039.76196: done extending task lists 10202 1727204039.76198: done processing included files 10202 1727204039.76199: results queue empty 10202 1727204039.76200: checking for any_errors_fatal 10202 1727204039.76201: done checking for any_errors_fatal 10202 
1727204039.76202: checking for max_fail_percentage 10202 1727204039.76203: done checking for max_fail_percentage 10202 1727204039.76204: checking to see if all hosts have failed and the running result is not ok 10202 1727204039.76205: done checking to see if all hosts have failed 10202 1727204039.76206: getting the remaining hosts for this loop 10202 1727204039.76208: done getting the remaining hosts for this loop 10202 1727204039.76210: getting the next task for host managed-node3 10202 1727204039.76215: done getting next task for host managed-node3 10202 1727204039.76218: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 10202 1727204039.76220: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204039.76222: getting variables 10202 1727204039.76223: in VariableManager get_vars() 10202 1727204039.76233: Calling all_inventory to load vars for managed-node3 10202 1727204039.76235: Calling groups_inventory to load vars for managed-node3 10202 1727204039.76238: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204039.76244: Calling all_plugins_play to load vars for managed-node3 10202 1727204039.76247: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204039.76250: Calling groups_plugins_play to load vars for managed-node3 10202 1727204039.76399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204039.76588: done with get_vars() 10202 1727204039.76597: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:53:59 -0400 (0:00:00.036) 0:00:01.442 ***** 10202 1727204039.76676: entering _queue_task() for managed-node3/setup 10202 1727204039.77195: worker is 1 (out of 1 available) 10202 1727204039.77207: exiting _queue_task() for managed-node3/setup 10202 1727204039.77221: done queuing things up, now waiting for results queue to drain 10202 1727204039.77222: waiting for pending results... 
10202 1727204039.77664: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 10202 1727204039.77702: in run() - task 127b8e07-fff9-0b04-2570-0000000000dd 10202 1727204039.77726: variable 'ansible_search_path' from source: unknown 10202 1727204039.77740: variable 'ansible_search_path' from source: unknown 10202 1727204039.77792: calling self._execute() 10202 1727204039.77888: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204039.77959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204039.77963: variable 'omit' from source: magic vars 10202 1727204039.78554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204039.83058: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204039.83251: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204039.83472: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204039.83476: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204039.83492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204039.83707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204039.83826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204039.83862: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204039.83955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204039.84049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204039.84453: variable 'ansible_facts' from source: unknown 10202 1727204039.84671: variable 'network_test_required_facts' from source: task vars 10202 1727204039.84729: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 10202 1727204039.84832: variable 'omit' from source: magic vars 10202 1727204039.84883: variable 'omit' from source: magic vars 10202 1727204039.84972: variable 'omit' from source: magic vars 10202 1727204039.85021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204039.85101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204039.85176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204039.85772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204039.85777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204039.85779: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204039.85785: variable 'ansible_host' from source: host vars for 
'managed-node3' 10202 1727204039.85788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204039.85790: Set connection var ansible_shell_type to sh 10202 1727204039.85791: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204039.86371: Set connection var ansible_connection to ssh 10202 1727204039.86374: Set connection var ansible_shell_executable to /bin/sh 10202 1727204039.86377: Set connection var ansible_pipelining to False 10202 1727204039.86379: Set connection var ansible_timeout to 10 10202 1727204039.86381: variable 'ansible_shell_executable' from source: unknown 10202 1727204039.86383: variable 'ansible_connection' from source: unknown 10202 1727204039.86385: variable 'ansible_module_compression' from source: unknown 10202 1727204039.86387: variable 'ansible_shell_type' from source: unknown 10202 1727204039.86389: variable 'ansible_shell_executable' from source: unknown 10202 1727204039.86391: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204039.86393: variable 'ansible_pipelining' from source: unknown 10202 1727204039.86395: variable 'ansible_timeout' from source: unknown 10202 1727204039.86397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204039.86595: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204039.86611: variable 'omit' from source: magic vars 10202 1727204039.86619: starting attempt loop 10202 1727204039.86629: running the handler 10202 1727204039.86646: _low_level_execute_command(): starting 10202 1727204039.86658: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204039.88042: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 10202 1727204039.88168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204039.88184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204039.88283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204039.88520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204039.88616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204039.90463: stdout chunk (state=3): >>>/root <<< 10202 1727204039.90590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204039.90670: stderr chunk (state=3): >>><<< 10202 1727204039.90695: stdout chunk (state=3): >>><<< 10202 1727204039.90895: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204039.90908: _low_level_execute_command(): starting 10202 1727204039.90911: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513 `" && echo ansible-tmp-1727204039.9071896-10434-215372405551513="` echo /root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513 `" ) && sleep 0' 10202 1727204039.92183: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204039.92187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204039.92192: stderr chunk (state=3): 
>>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204039.92292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204039.92365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204039.92390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204039.92500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204039.94699: stdout chunk (state=3): >>>ansible-tmp-1727204039.9071896-10434-215372405551513=/root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513 <<< 10202 1727204039.95086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204039.95090: stdout chunk (state=3): >>><<< 10202 1727204039.95093: stderr chunk (state=3): >>><<< 10202 1727204039.95096: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204039.9071896-10434-215372405551513=/root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204039.95099: variable 'ansible_module_compression' from source: unknown 10202 1727204039.95310: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10202 1727204039.95401: variable 'ansible_facts' from source: unknown 10202 1727204039.95773: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/AnsiballZ_setup.py 10202 1727204039.95993: Sending initial data 10202 1727204039.96032: Sent initial data (154 bytes) 10202 1727204039.96701: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204039.96788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204039.96826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204039.96849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204039.96896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204039.97055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204039.98890: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204039.99056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204039.99061: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpgij77035 /root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/AnsiballZ_setup.py <<< 10202 1727204039.99064: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/AnsiballZ_setup.py" <<< 10202 1727204039.99155: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpgij77035" to remote "/root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/AnsiballZ_setup.py" <<< 10202 1727204040.00979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204040.01041: stderr chunk (state=3): >>><<< 10202 1727204040.01052: stdout chunk (state=3): >>><<< 10202 1727204040.01149: done transferring module to remote 10202 1727204040.01175: _low_level_execute_command(): starting 10202 1727204040.01186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/ /root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/AnsiballZ_setup.py && sleep 0' 10202 1727204040.02593: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204040.02725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204040.02731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204040.02734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204040.02985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204040.04825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204040.04945: stderr chunk (state=3): >>><<< 10202 1727204040.04976: stdout chunk (state=3): >>><<< 10202 1727204040.05005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204040.05014: _low_level_execute_command(): starting 10202 1727204040.05026: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/AnsiballZ_setup.py && sleep 0' 10202 1727204040.05780: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204040.05806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204040.05875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204040.05887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204040.05959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204040.05986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204040.06007: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 10202 1727204040.06113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204040.08663: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 10202 1727204040.08690: stdout chunk (state=3): >>>import _imp # builtin <<< 10202 1727204040.08712: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 10202 1727204040.08799: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 10202 1727204040.08889: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 10202 1727204040.08916: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 10202 1727204040.08975: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.08999: stdout chunk (state=3): >>>import '_codecs' # <<< 10202 1727204040.09033: stdout chunk (state=3): >>>import 'codecs' # <<< 10202 1727204040.09068: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10202 1727204040.09103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89519c530> <<< 10202 1727204040.09149: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89516bb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa89519eab0> <<< 10202 1727204040.09171: stdout chunk (state=3): >>>import '_signal' # <<< 10202 1727204040.09222: stdout chunk (state=3): >>>import '_abc' # <<< 10202 1727204040.09226: stdout chunk (state=3): >>>import 'abc' # <<< 10202 1727204040.09229: stdout chunk (state=3): >>>import 'io' # <<< 10202 1727204040.09263: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 10202 1727204040.09370: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10202 1727204040.09393: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 10202 1727204040.09432: stdout chunk (state=3): >>>import 'os' # <<< 10202 1727204040.09469: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 10202 1727204040.09497: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 10202 1727204040.09528: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 10202 1727204040.09532: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 10202 1727204040.09564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 10202 1727204040.09575: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f511c0> <<< 10202 1727204040.09650: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 10202 1727204040.09654: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.09670: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f520c0> <<< 10202 1727204040.09689: stdout chunk (state=3): >>>import 'site' # <<< 10202 1727204040.09720: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 10202 1727204040.10152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 10202 1727204040.10183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 10202 1727204040.10204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10202 1727204040.10267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 10202 1727204040.10286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 10202 1727204040.10325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 10202 1727204040.10329: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f8ffe0> <<< 10202 1727204040.10357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 10202 1727204040.10360: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 10202 1727204040.10404: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fa4170> <<< 10202 1727204040.10418: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 10202 1727204040.10437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 10202 1727204040.10468: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10202 1727204040.10520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.10568: stdout chunk (state=3): >>>import 'itertools' # <<< 10202 1727204040.10574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fc79b0> <<< 10202 1727204040.10623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 10202 1727204040.10626: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fc7f80> <<< 10202 1727204040.10642: stdout chunk (state=3): >>>import '_collections' # <<< 10202 1727204040.10694: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fa7c50> <<< 10202 1727204040.10704: stdout chunk (state=3): >>>import '_functools' # <<< 10202 1727204040.10737: stdout chunk (state=3): 
>>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fa53d0> <<< 10202 1727204040.10844: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f8d190> <<< 10202 1727204040.10864: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 10202 1727204040.10887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 10202 1727204040.10904: stdout chunk (state=3): >>>import '_sre' # <<< 10202 1727204040.10922: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 10202 1727204040.10961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 10202 1727204040.10981: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 10202 1727204040.11014: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894feb920> <<< 10202 1727204040.11035: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fea540> <<< 10202 1727204040.11074: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fa62a0> <<< 10202 1727204040.11092: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fe8d10> <<< 10202 1727204040.11137: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 10202 1727204040.11152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8950189b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f8c410> <<< 10202 1727204040.11181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 10202 1727204040.11218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10202 1727204040.11221: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.11239: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895018e60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895018d10> <<< 10202 1727204040.11271: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.11286: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895019100> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f8af30> <<< 10202 1727204040.11321: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.11338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 10202 1727204040.11395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 10202 1727204040.11398: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895019760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895019460> <<< 10202 1727204040.11431: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 10202 1727204040.11445: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 10202 1727204040.11477: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89501a660> <<< 10202 1727204040.11491: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 10202 1727204040.11517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 10202 1727204040.11553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 10202 1727204040.11583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 10202 1727204040.11606: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895034860> <<< 10202 1727204040.11609: stdout chunk (state=3): >>>import 'errno' # <<< 10202 1727204040.11651: stdout chunk (state=3): >>># extension 
module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895035fa0> <<< 10202 1727204040.11678: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 10202 1727204040.11693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 10202 1727204040.11720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 10202 1727204040.11740: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895036e10> <<< 10202 1727204040.11789: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895037470> <<< 10202 1727204040.11793: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895036390> <<< 10202 1727204040.11814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 10202 1727204040.11829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 10202 1727204040.11873: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.11893: stdout chunk (state=3): >>># 
extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895037e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895037560> <<< 10202 1727204040.11940: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89501a5d0> <<< 10202 1727204040.11961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 10202 1727204040.11995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 10202 1727204040.12018: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10202 1727204040.12035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10202 1727204040.12072: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894d6fcb0> <<< 10202 1727204040.12107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 10202 1727204040.12132: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894d98800> 
import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d98560> <<< 10202 1727204040.12162: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.12190: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894d98830> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.12230: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894d98a10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d6de50> <<< 10202 1727204040.12244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 10202 1727204040.12385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 10202 1727204040.12419: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 10202 1727204040.12433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d9a0f0> <<< 10202 1727204040.12471: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d98d70> <<< 10202 1727204040.12496: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89501ad80> <<< 10202 1727204040.12508: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10202 1727204040.12571: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.12590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10202 1727204040.12634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10202 1727204040.12658: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894dc24b0> <<< 10202 1727204040.12721: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 10202 1727204040.12740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.12767: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 10202 1727204040.12779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 10202 1727204040.12832: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894dde5d0> <<< 10202 1727204040.12849: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 10202 1727204040.12896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 10202 1727204040.12994: stdout chunk (state=3): >>>import 'ntpath' # <<< 10202 1727204040.13182: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894e13350> <<< 10202 1727204040.13203: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10202 1727204040.13221: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894e3daf0> <<< 10202 1727204040.13302: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894e13470> <<< 10202 1727204040.13340: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894ddf260> <<< 10202 1727204040.13377: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c14470> <<< 10202 1727204040.13402: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894ddd610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d9b050> <<< 10202 1727204040.13580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 10202 1727204040.13604: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa894ddd9d0> <<< 10202 
1727204040.13784: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_l9ub0_69/ansible_setup_payload.zip' # zipimport: zlib available <<< 10202 1727204040.14056: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 10202 1727204040.14120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 10202 1727204040.14168: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c82210> import '_typing' # <<< 10202 1727204040.14383: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c59100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c58290> <<< 10202 1727204040.14427: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 10202 1727204040.14468: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.14498: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 10202 1727204040.16125: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.17524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa894c5b620> <<< 10202 1727204040.17553: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.17588: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 10202 1727204040.17592: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 10202 1727204040.17632: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894cb1c40> <<< 10202 1727204040.17671: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894cb19d0> <<< 10202 1727204040.17713: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894cb12e0> <<< 10202 1727204040.17740: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10202 1727204040.17775: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894cb1730> <<< 10202 1727204040.17797: stdout chunk (state=3): >>>import 'json' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa894c82ea0> import 'atexit' # <<< 10202 1727204040.17827: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894cb2960> <<< 10202 1727204040.17858: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894cb2ba0> <<< 10202 1727204040.17888: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10202 1727204040.17959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 10202 1727204040.17973: stdout chunk (state=3): >>>import '_locale' # <<< 10202 1727204040.18016: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894cb3020> <<< 10202 1727204040.18039: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 10202 1727204040.18076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10202 1727204040.18114: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b14d40> <<< 10202 1727204040.18144: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b16960> <<< 10202 1727204040.18178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 10202 1727204040.18192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10202 1727204040.18239: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b17320> <<< 10202 1727204040.18251: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10202 1727204040.18357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b184d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10202 1727204040.18395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10202 1727204040.18407: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10202 1727204040.18459: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1afc0> <<< 10202 1727204040.18503: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b1b320> <<< 10202 1727204040.18534: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b19280> <<< 10202 1727204040.18548: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10202 1727204040.18592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10202 1727204040.18626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 10202 1727204040.18630: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10202 1727204040.18698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 10202 1727204040.18722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1f050> import '_tokenize' # <<< 10202 1727204040.18825: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1db20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1d880> <<< 10202 1727204040.18828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 10202 1727204040.18850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10202 1727204040.18919: stdout chunk (state=3): >>>import 'textwrap' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1ff80> <<< 10202 1727204040.18955: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b19790> <<< 10202 1727204040.18984: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.19009: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b631d0> <<< 10202 1727204040.19043: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b63410> <<< 10202 1727204040.19050: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10202 1727204040.19094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10202 1727204040.19098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 10202 1727204040.19135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10202 1727204040.19143: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' 
import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b68ec0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b68c80> <<< 10202 1727204040.19176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10202 1727204040.19303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10202 1727204040.19360: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b6b380> <<< 10202 1727204040.19396: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b694f0> <<< 10202 1727204040.19400: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 10202 1727204040.19439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.19487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 10202 1727204040.19490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 10202 1727204040.19507: stdout chunk (state=3): >>>import '_string' # <<< 10202 1727204040.19535: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b72ba0> <<< 10202 1727204040.19679: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b6b530> <<< 10202 1727204040.19761: stdout chunk 
(state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b73950> <<< 10202 1727204040.19801: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b73b90> <<< 10202 1727204040.19857: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b73e30> <<< 10202 1727204040.19900: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b635f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 10202 1727204040.19904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 10202 1727204040.19921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 10202 1727204040.19953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 10202 
1727204040.19977: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.20015: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b77530> <<< 10202 1727204040.20200: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.20215: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b78800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b75cd0> <<< 10202 1727204040.20255: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.20275: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b77080> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b75970> # zipimport: zlib available <<< 10202 1727204040.20311: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 10202 1727204040.20324: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.20419: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.20536: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.20542: stdout chunk (state=3): >>># zipimport: zlib 
available import 'ansible.module_utils.common' # <<< 10202 1727204040.20583: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 10202 1727204040.20586: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.20722: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.20852: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.21546: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.22317: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8949fc920> <<< 10202 1727204040.22414: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10202 1727204040.22444: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8949fd700> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b788c0> <<< 10202 1727204040.22490: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 10202 1727204040.22517: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 10202 1727204040.22536: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.22552: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 10202 1727204040.22739: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.22914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 10202 1727204040.22936: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8949fd4f0> <<< 10202 1727204040.22956: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.23500: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24035: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24103: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24190: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 10202 1727204040.24238: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24270: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 10202 1727204040.24291: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24372: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24477: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10202 1727204040.24512: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24515: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 10202 1727204040.24526: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24553: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24600: stdout chunk (state=3): 
>>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10202 1727204040.24621: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.24885: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.25160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 10202 1727204040.25242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 10202 1727204040.25246: stdout chunk (state=3): >>>import '_ast' # <<< 10202 1727204040.25337: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8949fe630> <<< 10202 1727204040.25340: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.25421: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.25512: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 10202 1727204040.25532: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 10202 1727204040.25558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10202 1727204040.25659: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.25792: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894a06180> <<< 10202 1727204040.25869: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894a06ae0> <<< 10202 1727204040.25887: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8949ff4d0> <<< 10202 1727204040.25899: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.25946: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.25999: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 10202 1727204040.26003: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.26054: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.26101: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.26157: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.26241: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10202 1727204040.26294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.26403: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894a05970> <<< 10202 1727204040.26446: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a06c60> <<< 10202 1727204040.26482: 
stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 10202 1727204040.26486: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 10202 1727204040.26558: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.26899: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.26903: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10202 1727204040.26936: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9aed0> <<< 10202 1727204040.26992: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a10bf0> <<< 10202 1727204040.27073: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a0ed80> <<< 10202 1727204040.27102: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a0ebd0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10202 1727204040.27123: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.27159: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 10202 1727204040.27228: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 10202 1727204040.27253: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.27276: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 10202 1727204040.27342: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27417: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27441: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27461: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27493: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27545: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27576: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27620: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 10202 1727204040.27634: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27770: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27796: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27825: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.27878: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 10202 1727204040.27881: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.28085: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.28278: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.28327: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 
1727204040.28414: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.28462: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 10202 1727204040.28499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 10202 1727204040.28503: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9dd90> <<< 10202 1727204040.28538: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 10202 1727204040.28551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 10202 1727204040.28573: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 10202 1727204040.28613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 10202 1727204040.28641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 10202 1727204040.28662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa893f3c530> <<< 10202 1727204040.28695: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.28717: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893f3c860> <<< 10202 1727204040.28775: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a7d580> <<< 10202 1727204040.28804: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a7c710> <<< 10202 1727204040.28831: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9c4d0> <<< 10202 1727204040.28854: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9c0e0> <<< 10202 1727204040.28868: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 10202 1727204040.28932: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 10202 1727204040.28970: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 10202 1727204040.28979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 10202 1727204040.29031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 10202 
1727204040.29036: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893f3f800> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893f3f0b0> <<< 10202 1727204040.29081: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893f3f290> <<< 10202 1727204040.29085: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893f3e510> <<< 10202 1727204040.29098: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 10202 1727204040.29250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 10202 1727204040.29267: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893f3f9b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 10202 1727204040.29305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 10202 1727204040.29339: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.29371: stdout chunk (state=3): 
>>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893fa64e0> <<< 10202 1727204040.29407: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893fa4500> <<< 10202 1727204040.29411: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9d550> import 'ansible.module_utils.facts.timeout' # <<< 10202 1727204040.29442: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 10202 1727204040.29481: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 10202 1727204040.29495: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.29545: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.29610: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 10202 1727204040.29627: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.29717: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.29890: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 10202 1727204040.29919: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.29970: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 10202 1727204040.29998: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.30207: stdout chunk (state=3): >>># zipimport: zlib 
available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.30244: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.30311: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 10202 1727204040.30345: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 10202 1727204040.30888: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 10202 1727204040.31416: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31451: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31515: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31546: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31585: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 10202 1727204040.31608: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31629: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31671: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 10202 1727204040.31684: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31740: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 10202 1727204040.31815: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31846: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 10202 1727204040.31920: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31924: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.31963: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 10202 1727204040.31973: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.32048: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.32294: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893fa69f0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 10202 1727204040.32386: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893fa7650> <<< 10202 1727204040.32390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 10202 1727204040.32457: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.32531: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 10202 1727204040.32545: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.32638: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.32738: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 10202 1727204040.32758: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.32822: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.32900: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 10202 1727204040.32919: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 10202 1727204040.32956: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.33005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 10202 1727204040.33063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 10202 1727204040.33291: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893fdaae0> <<< 10202 1727204040.33443: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893fc27b0> <<< 10202 1727204040.33464: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 10202 1727204040.33522: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.33686: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.33831: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.33896: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.34063: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 10202 1727204040.34082: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.34119: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.34223: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 10202 1727204040.34226: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.34240: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.34270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 10202 1727204040.34340: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.34344: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893ff6090> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893ff5d60> import 'ansible.module_utils.facts.system.user' # <<< 10202 1727204040.34519: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 10202 1727204040.34523: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 10202 1727204040.34712: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.34837: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 10202 1727204040.34882: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.34953: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.35073: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.35160: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.35177: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 10202 1727204040.35260: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 10202 1727204040.35263: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.35377: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.35543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 10202 1727204040.35558: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.35692: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.35934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 10202 1727204040.35938: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.35940: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.36550: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.37301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.37404: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 10202 1727204040.37422: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.37548: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.37636: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 10202 1727204040.37649: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.37809: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.38027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 10202 1727204040.38030: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.38033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib 
available <<< 10202 1727204040.38084: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.38126: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 10202 1727204040.38140: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.38241: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.38346: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.38580: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.38819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 10202 1727204040.38852: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.38910: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.38925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 10202 1727204040.38981: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 10202 1727204040.39072: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.39133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 10202 1727204040.39190: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 10202 1727204040.39297: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.39315: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.39339: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 10202 1727204040.39404: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.39472: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.hurd' # <<< 10202 1727204040.39494: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.39818: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.40100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 10202 1727204040.40163: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.40233: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 10202 1727204040.40236: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.40278: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.40338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 10202 1727204040.40341: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.40351: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.40417: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 10202 1727204040.40428: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.40495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 10202 1727204040.40560: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.40699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 10202 1727204040.40774: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.40985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.40988: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available <<< 10202 1727204040.41035: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.41100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 10202 1727204040.41190: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.41718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 10202 1727204040.41721: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # <<< 10202 1727204040.41723: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.41761: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.41812: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 10202 1727204040.41830: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.41868: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.41918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 10202 1727204040.42012: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.42031: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.42118: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 10202 1727204040.42134: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 10202 1727204040.42311: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.42336: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 10202 1727204040.42446: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 10202 1727204040.43562: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893e1ec30> <<< 10202 1727204040.43586: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893e1d2e0> <<< 10202 1727204040.43629: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893e1dd60> <<< 10202 1727204040.44075: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "00", "epoch": "1727204040", "epoch_int": "1727204040", "date": "2024-09-24", "time": "14:54:00", "iso8601_micro": "2024-09-24T18:54:00.438616Z", "iso8601": "2024-09-24T18:54:00Z", "iso8601_basic": "20240924T145400438616", "iso8601_basic_short": "20240924T145400", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root<<< 10202 1727204040.44110: stdout chunk (state=3): >>>/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10202 1727204040.44750: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 10202 1727204040.44783: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path <<< 10202 1727204040.44879: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] 
removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 10202 1727204040.44901: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] 
removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 10202 1727204040.44929: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file <<< 10202 1727204040.45132: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # 
cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector<<< 10202 1727204040.45135: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy 
ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy 
ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 10202 1727204040.45543: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 10202 1727204040.45567: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 10202 1727204040.45596: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 10202 1727204040.45684: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 10202 1727204040.45731: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 10202 1727204040.45735: stdout chunk (state=3): >>># destroy locale # destroy select <<< 10202 1727204040.45745: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 10202 1727204040.46014: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # 
destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 10202 1727204040.46018: stdout chunk (state=3): >>># destroy _ssl <<< 10202 1727204040.46064: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 10202 1727204040.46074: stdout chunk (state=3): >>># destroy termios # destroy errno # destroy json <<< 10202 1727204040.46105: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 10202 1727204040.46126: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 10202 1727204040.46192: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 10202 1727204040.46270: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 10202 1727204040.46325: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # 
cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 10202 1727204040.46494: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10202 1727204040.46667: stdout chunk (state=3): >>># destroy sys.monitoring <<< 10202 1727204040.46691: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 10202 1727204040.46715: stdout chunk (state=3): >>># destroy platform <<< 10202 1727204040.46736: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy 
genericpath # destroy re._parser # destroy tokenize <<< 10202 1727204040.46761: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 10202 1727204040.46818: stdout chunk (state=3): >>># destroy _typing <<< 10202 1727204040.46832: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 10202 1727204040.46860: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 10202 1727204040.46973: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 10202 1727204040.46986: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 10202 1727204040.47101: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 10202 1727204040.47105: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 10202 1727204040.47690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204040.47774: stderr chunk (state=3): >>><<< 10202 1727204040.48176: stdout chunk (state=3): >>><<< 10202 1727204040.48386: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89519c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89516bb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89519eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f511c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f520c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f8ffe0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fa4170> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fc79b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fc7f80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fa7c50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fa53d0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f8d190> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894feb920> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa894fea540> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fa62a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894fe8d10> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8950189b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f8c410> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895018e60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895018d10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895019100> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894f8af30> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895019760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895019460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89501a660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895034860> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895035fa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa895036e10> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895037470> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895036390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa895037e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa895037560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89501a5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894d6fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894d98800> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d98560> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894d98830> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894d98a10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d6de50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d9a0f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d98d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa89501ad80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894dc24b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894dde5d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894e13350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894e3daf0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894e13470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894ddf260> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c14470> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894ddd610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894d9b050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa894ddd9d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_l9ub0_69/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c82210> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c59100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c58290> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c5b620> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894cb1c40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894cb19d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894cb12e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894cb1730> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894c82ea0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894cb2960> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894cb2ba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894cb3020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b14d40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b16960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b17320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b184d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1afc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b1b320> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b19280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1f050> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1db20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1d880> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b1ff80> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa894b19790> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b631d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b63410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b68ec0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b68c80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b6b380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b694f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b72ba0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b6b530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b73950> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b73b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b73e30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b635f0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b77530> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b78800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b75cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894b77080> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b75970> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa8949fc920> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8949fd700> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894b788c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8949fd4f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8949fe630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894a06180> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894a06ae0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa8949ff4d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa894a05970> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a06c60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9aed0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a10bf0> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa894a0ed80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a0ebd0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9dd90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893f3c530> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893f3c860> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a7d580> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a7c710> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9c4d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9c0e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893f3f800> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893f3f0b0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893f3f290> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893f3e510> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893f3f9b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893fa64e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893fa4500> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa894a9d550> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893fa69f0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893fa7650> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893fdaae0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893fc27b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893ff6090> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893ff5d60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa893e1ec30> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893e1d2e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa893e1dd60> {"ansible_facts": 
{"ansible_fips": false, "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": 
"False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "00", "epoch": "1727204040", "epoch_int": "1727204040", "date": "2024-09-24", "time": "14:54:00", "iso8601_micro": "2024-09-24T18:54:00.438616Z", "iso8601": "2024-09-24T18:54:00Z", "iso8601_basic": "20240924T145400438616", "iso8601_basic_short": "20240924T145400", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 
39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # 
cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy 
_weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] 
removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # 
cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy 
ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc 
# cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # 
destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] 
removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing 
ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # 
destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib 
# cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 10202 1727204040.50072: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204040.50076: _low_level_execute_command(): starting 10202 1727204040.50079: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204039.9071896-10434-215372405551513/ > /dev/null 2>&1 && sleep 0' 10202 1727204040.50081: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204040.50084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204040.50086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204040.50088: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204040.50154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204040.50157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204040.50261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204040.50275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204040.50287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204040.50402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204040.52527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204040.52605: stderr chunk (state=3): >>><<< 10202 1727204040.52613: stdout 
chunk (state=3): >>><<< 10202 1727204040.52636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204040.52772: handler run complete 10202 1727204040.52954: variable 'ansible_facts' from source: unknown 10202 1727204040.52957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204040.53075: variable 'ansible_facts' from source: unknown 10202 1727204040.53141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204040.53206: attempt loop complete, returning result 10202 1727204040.53211: _execute() done 10202 1727204040.53221: dumping result to json 10202 1727204040.53242: done dumping result, returning 10202 1727204040.53251: done running TaskExecutor() for 
managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-0b04-2570-0000000000dd] 10202 1727204040.53257: sending task result for task 127b8e07-fff9-0b04-2570-0000000000dd 10202 1727204040.53445: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000dd 10202 1727204040.53449: WORKER PROCESS EXITING ok: [managed-node3] 10202 1727204040.53619: no more pending results, returning what we have 10202 1727204040.53622: results queue empty 10202 1727204040.53626: checking for any_errors_fatal 10202 1727204040.53628: done checking for any_errors_fatal 10202 1727204040.53628: checking for max_fail_percentage 10202 1727204040.53630: done checking for max_fail_percentage 10202 1727204040.53631: checking to see if all hosts have failed and the running result is not ok 10202 1727204040.53632: done checking to see if all hosts have failed 10202 1727204040.53633: getting the remaining hosts for this loop 10202 1727204040.53635: done getting the remaining hosts for this loop 10202 1727204040.53639: getting the next task for host managed-node3 10202 1727204040.53649: done getting next task for host managed-node3 10202 1727204040.53651: ^ task is: TASK: Check if system is ostree 10202 1727204040.53654: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204040.53658: getting variables 10202 1727204040.53660: in VariableManager get_vars() 10202 1727204040.53906: Calling all_inventory to load vars for managed-node3 10202 1727204040.53910: Calling groups_inventory to load vars for managed-node3 10202 1727204040.53913: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204040.53927: Calling all_plugins_play to load vars for managed-node3 10202 1727204040.53930: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204040.53933: Calling groups_plugins_play to load vars for managed-node3 10202 1727204040.54193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204040.54612: done with get_vars() 10202 1727204040.54627: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.781) 0:00:02.223 ***** 10202 1727204040.54856: entering _queue_task() for managed-node3/stat 10202 1727204040.55406: worker is 1 (out of 1 available) 10202 1727204040.55427: exiting _queue_task() for managed-node3/stat 10202 1727204040.55440: done queuing things up, now waiting for results queue to drain 10202 1727204040.55441: waiting for pending results... 
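The `_low_level_execute_command()` cleanup recorded earlier in this trace runs a fixed shell pattern: remove the per-task temp directory, discard all output, and chain `sleep 0` so the exit status comes from the final step. A minimal stand-alone sketch of that pattern (the path here is a placeholder, not the tmpdir from this run):

```shell
# Sketch of the remote cleanup pattern from the trace above.
# The tmpdir name is a placeholder, not the one Ansible generated.
tmpdir="/tmp/ansible-tmp-example-$$"
mkdir -p "$tmpdir"

# Mirror of: /bin/sh -c 'rm -f -r <tmpdir>/ > /dev/null 2>&1 && sleep 0'
/bin/sh -c "rm -f -r $tmpdir > /dev/null 2>&1 && sleep 0"

echo "rc=$?"   # prints: rc=0
```

The trailing `&& sleep 0` makes the compound command's exit status deterministic regardless of shell quirks, which is why it appears on every low-level command in this log.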
10202 1727204040.55763: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 10202 1727204040.55858: in run() - task 127b8e07-fff9-0b04-2570-0000000000df 10202 1727204040.55862: variable 'ansible_search_path' from source: unknown 10202 1727204040.55866: variable 'ansible_search_path' from source: unknown 10202 1727204040.55876: calling self._execute() 10202 1727204040.55969: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204040.55983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204040.55999: variable 'omit' from source: magic vars 10202 1727204040.56695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204040.57069: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204040.57113: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204040.57162: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204040.57271: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204040.57317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204040.57351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204040.57393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204040.57431: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204040.57574: Evaluated conditional (not __network_is_ostree is defined): True 10202 1727204040.57601: variable 'omit' from source: magic vars 10202 1727204040.57649: variable 'omit' from source: magic vars 10202 1727204040.57710: variable 'omit' from source: magic vars 10202 1727204040.57741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204040.57819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204040.57822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204040.57828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204040.57830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204040.57930: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204040.57933: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204040.57935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204040.57988: Set connection var ansible_shell_type to sh 10202 1727204040.58001: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204040.58012: Set connection var ansible_connection to ssh 10202 1727204040.58025: Set connection var ansible_shell_executable to /bin/sh 10202 1727204040.58041: Set connection var ansible_pipelining to False 10202 1727204040.58057: Set connection var ansible_timeout to 10 10202 1727204040.58090: variable 'ansible_shell_executable' from source: unknown 10202 1727204040.58099: variable 'ansible_connection' from 
source: unknown 10202 1727204040.58106: variable 'ansible_module_compression' from source: unknown 10202 1727204040.58113: variable 'ansible_shell_type' from source: unknown 10202 1727204040.58120: variable 'ansible_shell_executable' from source: unknown 10202 1727204040.58147: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204040.58149: variable 'ansible_pipelining' from source: unknown 10202 1727204040.58152: variable 'ansible_timeout' from source: unknown 10202 1727204040.58269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204040.58321: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204040.58340: variable 'omit' from source: magic vars 10202 1727204040.58349: starting attempt loop 10202 1727204040.58356: running the handler 10202 1727204040.58377: _low_level_execute_command(): starting 10202 1727204040.58398: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204040.59374: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204040.59396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204040.59451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204040.59474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204040.59901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204040.61748: stdout chunk (state=3): >>>/root <<< 10202 1727204040.62016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204040.62020: stdout chunk (state=3): >>><<< 10202 1727204040.62022: stderr chunk (state=3): >>><<< 10202 1727204040.62052: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204040.62144: _low_level_execute_command(): starting 10202 1727204040.62149: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093 `" && echo ansible-tmp-1727204040.6207376-10476-216396722400093="` echo /root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093 `" ) && sleep 0' 10202 1727204040.63659: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204040.63690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204040.63731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204040.63811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204040.63840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204040.63884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 
1727204040.64046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204040.66207: stdout chunk (state=3): >>>ansible-tmp-1727204040.6207376-10476-216396722400093=/root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093 <<< 10202 1727204040.66402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204040.66447: stderr chunk (state=3): >>><<< 10202 1727204040.66457: stdout chunk (state=3): >>><<< 10202 1727204040.66489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204040.6207376-10476-216396722400093=/root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204040.66632: variable 'ansible_module_compression' from source: unknown 10202 1727204040.66648: ANSIBALLZ: Using lock 
for stat 10202 1727204040.66656: ANSIBALLZ: Acquiring lock 10202 1727204040.66664: ANSIBALLZ: Lock acquired: 140045305565680 10202 1727204040.66676: ANSIBALLZ: Creating module 10202 1727204040.81946: ANSIBALLZ: Writing module into payload 10202 1727204040.82100: ANSIBALLZ: Writing module 10202 1727204040.82106: ANSIBALLZ: Renaming module 10202 1727204040.82109: ANSIBALLZ: Done creating module 10202 1727204040.82132: variable 'ansible_facts' from source: unknown 10202 1727204040.82213: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/AnsiballZ_stat.py 10202 1727204040.82493: Sending initial data 10202 1727204040.82497: Sent initial data (153 bytes) 10202 1727204040.83202: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204040.83225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204040.83242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204040.83482: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204040.85290: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10202 1727204040.85324: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204040.85393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204040.85481: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpkfomc_y2 /root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/AnsiballZ_stat.py <<< 10202 1727204040.85484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/AnsiballZ_stat.py" <<< 10202 1727204040.85541: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpkfomc_y2" to remote "/root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/AnsiballZ_stat.py" <<< 10202 1727204040.86525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204040.86685: stderr chunk (state=3): >>><<< 10202 1727204040.86689: stdout chunk (state=3): >>><<< 10202 1727204040.86691: done transferring module to remote 10202 1727204040.86694: _low_level_execute_command(): starting 10202 1727204040.86696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/ /root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/AnsiballZ_stat.py && sleep 0' 10202 1727204040.87542: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204040.87562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204040.87581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204040.87607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204040.87788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204040.89828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204040.89925: stderr chunk (state=3): >>><<< 10202 1727204040.89928: stdout chunk (state=3): >>><<< 10202 1727204040.89946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204040.89955: _low_level_execute_command(): starting 10202 1727204040.89967: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/AnsiballZ_stat.py && sleep 0' 10202 1727204040.90604: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204040.90621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204040.90636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204040.90657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204040.90678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204040.90690: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204040.90704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204040.90722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204040.90786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 
1727204040.90828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204040.90856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204040.90873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204040.91102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204040.93660: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 10202 1727204040.93678: stdout chunk (state=3): >>>import _imp # builtin <<< 10202 1727204040.93694: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 10202 1727204040.93767: stdout chunk (state=3): >>>import '_io' # <<< 10202 1727204040.93781: stdout chunk (state=3): >>>import 'marshal' # <<< 10202 1727204040.93804: stdout chunk (state=3): >>>import 'posix' # <<< 10202 1727204040.93880: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 10202 1727204040.93885: stdout chunk (state=3): >>>import 'time' # <<< 10202 1727204040.93888: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 10202 1727204040.94172: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 10202 1727204040.94176: stdout chunk (state=3): >>>import 'codecs' # <<< 10202 1727204040.94211: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d89118530> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6d890e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d8911aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 10202 1727204040.94313: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10202 1727204040.94341: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 10202 1727204040.94373: stdout chunk (state=3): >>>import 'os' # <<< 10202 1727204040.94428: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 10202 1727204040.94449: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 10202 1727204040.94488: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 10202 1727204040.94504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88ecd190> <<< 10202 1727204040.94580: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 10202 1727204040.94593: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import 
'_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88ece090> <<< 10202 1727204040.94617: stdout chunk (state=3): >>>import 'site' # <<< 10202 1727204040.94653: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 10202 1727204040.94904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 10202 1727204040.94928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 10202 1727204040.94967: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.94971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10202 1727204040.95053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 10202 1727204040.95073: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 10202 1727204040.95138: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f0bf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 10202 1727204040.95218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f200e0> <<< 10202 1727204040.95232: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10202 1727204040.95277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.95301: stdout chunk (state=3): >>>import 'itertools' # <<< 10202 1727204040.95421: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f43950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 10202 1727204040.95424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f43fe0> import '_collections' # <<< 10202 1727204040.95445: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f23c20> import '_functools' # <<< 10202 1727204040.95598: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f21340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f09100> <<< 10202 1727204040.95618: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 10202 1727204040.95640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 10202 1727204040.95658: stdout chunk (state=3): >>>import '_sre' # <<< 10202 
1727204040.95677: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 10202 1727204040.95697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 10202 1727204040.95720: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 10202 1727204040.95736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 10202 1727204040.95780: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f678c0> <<< 10202 1727204040.95814: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f664e0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f221e0> <<< 10202 1727204040.95893: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f64d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f94950> <<< 10202 1727204040.95914: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f08380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10202 1727204040.96026: stdout chunk (state=3): >>># extension module 
'_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88f94e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f94cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88f95070> <<< 10202 1727204040.96039: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f06ea0> <<< 10202 1727204040.96062: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.96126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 10202 1727204040.96200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f95730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f95400> import 'importlib.machinery' # <<< 10202 1727204040.96205: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6d88f96600> <<< 10202 1727204040.96242: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 10202 1727204040.96245: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 10202 1727204040.96557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 10202 1727204040.96562: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88fb0830> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88fb1f70> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88fb2e10> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88fb3440> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88fb2360> <<< 10202 1727204040.96569: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 10202 1727204040.96590: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88fb3e30> <<< 10202 1727204040.96603: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88fb3560> <<< 10202 1727204040.96650: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f96660> <<< 10202 1727204040.96668: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 10202 1727204040.96714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10202 1727204040.96739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10202 1727204040.96772: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204040.96796: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88d8fce0> <<< 10202 1727204040.96841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from 
'/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88db8710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88db8470> <<< 10202 1727204040.97103: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88db8740> <<< 10202 1727204040.97106: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88db8920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88d8deb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88db9fa0> <<< 10202 1727204040.97130: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88db8c20> <<< 10202 1727204040.97147: stdout chunk (state=3): >>>import 
'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f96d50> <<< 10202 1727204040.97179: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10202 1727204040.97237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.97257: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10202 1727204040.97312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10202 1727204040.97339: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88de6360> <<< 10202 1727204040.97382: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 10202 1727204040.97483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204040.97507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88dfe4b0> <<< 10202 1727204040.97519: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 10202 1727204040.97562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 10202 1727204040.97623: stdout chunk (state=3): >>>import 'ntpath' # <<< 10202 1727204040.97648: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88e3b260> <<< 10202 1727204040.97672: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 10202 1727204040.97740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10202 1727204040.97743: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10202 1727204040.97889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10202 1727204040.97892: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88e5da00> <<< 10202 1727204040.97961: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88e3b380> <<< 10202 1727204040.98010: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88dff140> <<< 10202 1727204040.98034: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 10202 1727204040.98057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c3c380> <<< 10202 1727204040.98200: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88dfd4f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88dbaf00> 
# code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 10202 1727204040.98203: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6d88dfd280> <<< 10202 1727204040.98285: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_mbfwy3bt/ansible_stat_payload.zip' # zipimport: zlib available <<< 10202 1727204040.98445: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204040.98476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 10202 1727204040.98693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10202 1727204040.98697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c92060> import '_typing' # <<< 10202 1727204040.98873: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c68f50> <<< 10202 1727204040.98892: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c680e0> # zipimport: zlib available <<< 10202 1727204040.98915: stdout chunk (state=3): >>>import 'ansible' # <<< 10202 1727204040.98947: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10202 1727204040.98971: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 10202 1727204040.98993: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 10202 1727204041.00643: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.02046: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c6bec0> <<< 10202 1727204041.02067: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204041.02121: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 10202 1727204041.02161: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88cbda00> <<< 10202 1727204041.02232: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88cbd790> <<< 10202 1727204041.02244: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88cbd0d0> <<< 10202 1727204041.02339: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 10202 
1727204041.02357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88cbdb20> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c92ae0> import 'atexit' # <<< 10202 1727204041.02381: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88cbe750> <<< 10202 1727204041.02403: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88cbe990> <<< 10202 1727204041.02440: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10202 1727204041.02559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 10202 1727204041.02564: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88cbeed0> <<< 10202 1727204041.02586: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 10202 1727204041.02615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10202 1727204041.02656: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b20c50> <<< 10202 1727204041.02751: 
stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b22870> <<< 10202 1727204041.02777: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b23230> <<< 10202 1727204041.02798: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10202 1727204041.02836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 10202 1727204041.02879: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b24410> <<< 10202 1727204041.02882: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10202 1727204041.02948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10202 1727204041.02952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10202 1727204041.03101: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b26ea0> <<< 10202 1727204041.03104: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module 
'_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b26fc0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b25160> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10202 1727204041.03136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10202 1727204041.03156: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 10202 1727204041.03220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10202 1727204041.03246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 10202 1727204041.03312: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b2adb0> <<< 10202 1727204041.03315: stdout chunk (state=3): >>>import '_tokenize' # <<< 10202 1727204041.03361: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b29880> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b295e0> <<< 10202 1727204041.03420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 10202 1727204041.03423: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10202 1727204041.03508: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b2bce0> <<< 10202 1727204041.03564: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b25670> <<< 10202 1727204041.03588: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b72ed0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b73080> <<< 10202 1727204041.03868: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10202 1727204041.03916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10202 1727204041.03920: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f6d88b78c80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b78a40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10202 1727204041.03935: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b7b1d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b79370> <<< 10202 1727204041.03954: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 10202 1727204041.04012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204041.04031: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 10202 1727204041.04056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 10202 1727204041.04103: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b7e9f0> <<< 10202 1727204041.04246: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b7b380> <<< 10202 1727204041.04336: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204041.04358: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b7fc50> <<< 10202 1727204041.04379: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b7fa10> <<< 10202 1727204041.04417: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204041.04445: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b7fad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b73350> <<< 10202 1727204041.04464: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 10202 1727204041.04485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 10202 1727204041.04521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 10202 1727204041.04549: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204041.04581: stdout chunk (state=3): 
>>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b833e0> <<< 10202 1727204041.04764: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204041.04882: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b84620> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b81b50> <<< 10202 1727204041.04913: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b82f00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b81760> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 10202 1727204041.04975: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.05079: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.05107: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 10202 1727204041.05192: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 10202 1727204041.05196: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.05284: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 
1727204041.05420: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.06085: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.06739: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 10202 1727204041.06774: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 10202 1727204041.06793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204041.06850: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88c0c6e0> <<< 10202 1727204041.06983: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10202 1727204041.06995: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c0d520> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b80a40> <<< 10202 1727204041.07078: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204041.07100: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 10202 1727204041.07277: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.07484: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c0d4f0> <<< 10202 1727204041.07487: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.08197: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.08573: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.08646: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.08729: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 10202 1727204041.08750: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.08782: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.08824: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 10202 1727204041.08836: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.08910: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.09003: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10202 1727204041.09033: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 10202 1727204041.09082: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.09103: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.09370: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10202 1727204041.09385: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.09581: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.09988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 10202 1727204041.10018: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c0e330> # zipimport: zlib available # zipimport: zlib available <<< 10202 1727204041.10042: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 10202 1727204041.10075: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 10202 1727204041.10100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10202 1727204041.10203: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10202 1727204041.10336: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88a1a120> <<< 10202 1727204041.10395: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88a1aa50> <<< 10202 1727204041.10419: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c0f2f0> <<< 10202 1727204041.10441: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.10479: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.10518: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.locale' # <<< 10202 1727204041.10537: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.10578: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.10630: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.10691: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.10773: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10202 1727204041.10824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10202 1727204041.10932: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88a197c0> <<< 10202 1727204041.11018: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88a1ac00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 10202 1727204041.11036: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.11105: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.11171: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.11200: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.11303: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10202 1727204041.11335: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 10202 1727204041.11339: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10202 1727204041.11399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 10202 1727204041.11538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 10202 1727204041.11561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88aaad50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88a249b0> <<< 10202 1727204041.11653: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88a22b70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88a229f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10202 1727204041.11687: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.11701: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.11727: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 10202 1727204041.11786: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 10202 1727204041.11871: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 
10202 1727204041.11888: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.11998: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.12224: stdout chunk (state=3): >>># zipimport: zlib available <<< 10202 1727204041.12368: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 10202 1727204041.12872: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ <<< 10202 1727204041.12877: stdout chunk (state=3): >>># clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal <<< 10202 1727204041.12908: stdout chunk (state=3): >>># cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] 
removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select <<< 10202 1727204041.12920: stdout chunk (state=3): >>># cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] 
removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 10202 1727204041.13059: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] 
removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 10202 1727204041.13287: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 10202 1727204041.13313: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 10202 1727204041.13347: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal <<< 10202 1727204041.13419: stdout chunk (state=3): >>># destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy 
_locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog <<< 10202 1727204041.13444: stdout chunk (state=3): >>># destroy uuid # destroy selectors # destroy errno # destroy array <<< 10202 1727204041.13515: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 10202 1727204041.13559: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 10202 1727204041.13715: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 10202 1727204041.13749: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] 
wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 10202 1727204041.13774: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux <<< 10202 1727204041.13796: stdout chunk (state=3): >>># destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10202 1727204041.13970: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 10202 1727204041.14013: stdout chunk (state=3): >>># destroy _collections <<< 10202 1727204041.14298: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 10202 1727204041.14302: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 10202 1727204041.14322: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 10202 1727204041.14348: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 10202 1727204041.14368: stdout chunk (state=3): >>># clear sys.audit hooks <<< 10202 1727204041.14852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204041.14877: stdout chunk (state=3): >>><<< 10202 1727204041.14880: stderr chunk (state=3): >>><<< 10202 1727204041.14975: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d89118530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d890e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d8911aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88ecd190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88ece090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f0bf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f200e0> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f43950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f43fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f23c20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f21340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f09100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f678c0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f664e0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f221e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f64d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f94950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f08380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88f94e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f94cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88f95070> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f06ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f95730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f95400> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f96600> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88fb0830> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88fb1f70> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6d88fb2e10> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88fb3440> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88fb2360> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88fb3e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88fb3560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f96660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88d8fce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88db8710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88db8470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88db8740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88db8920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88d8deb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88db9fa0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88db8c20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88f96d50> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88de6360> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88dfe4b0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88e3b260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88e5da00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88e3b380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88dff140> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c3c380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88dfd4f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88dbaf00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6d88dfd280> # zipimport: found 30 names in '/tmp/ansible_stat_payload_mbfwy3bt/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c92060> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c68f50> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c680e0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c6bec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88cbda00> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88cbd790> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88cbd0d0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88cbdb20> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c92ae0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88cbe750> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88cbe990> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88cbeed0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b20c50> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b22870> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b23230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b24410> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b26ea0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b26fc0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b25160> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b2adb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b29880> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b295e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b2bce0> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b25670> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b72ed0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b73080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b78c80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b78a40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b7b1d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b79370> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b7e9f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b7b380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b7fc50> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b7fa10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b7fad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b73350> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b833e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b84620> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b81b50> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88b82f00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b81760> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88c0c6e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c0d520> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88b80a40> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c0d4f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c0e330> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88a1a120> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88a1aa50> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88c0f2f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6d88a197c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88a1ac00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88aaad50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88a249b0> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6d88a22b70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6d88a229f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
# destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
[WARNING]: Module invocation had junk after the JSON data: (Python interpreter shutdown cleanup trace, identical to the stderr dump above) 10202 1727204041.16823: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204041.16827: _low_level_execute_command(): starting 10202 1727204041.16830: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204040.6207376-10476-216396722400093/ > /dev/null 2>&1 && sleep 0' 10202 1727204041.16832: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204041.17158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204041.17162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204041.17167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204041.17169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204041.19337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204041.19343: stdout chunk (state=3): >>><<< 
10202 1727204041.19418: stderr chunk (state=3): >>><<< 10202 1727204041.19422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204041.19425: handler run complete 10202 1727204041.19427: attempt loop complete, returning result 10202 1727204041.19429: _execute() done 10202 1727204041.19431: dumping result to json 10202 1727204041.19433: done dumping result, returning 10202 1727204041.19436: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [127b8e07-fff9-0b04-2570-0000000000df] 10202 1727204041.19438: sending task result for task 127b8e07-fff9-0b04-2570-0000000000df 10202 1727204041.19777: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000df 10202 1727204041.19781: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false 
} } 10202 1727204041.19851: no more pending results, returning what we have 10202 1727204041.19854: results queue empty 10202 1727204041.19855: checking for any_errors_fatal 10202 1727204041.19863: done checking for any_errors_fatal 10202 1727204041.19864: checking for max_fail_percentage 10202 1727204041.19868: done checking for max_fail_percentage 10202 1727204041.19868: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.19869: done checking to see if all hosts have failed 10202 1727204041.19870: getting the remaining hosts for this loop 10202 1727204041.19872: done getting the remaining hosts for this loop 10202 1727204041.19876: getting the next task for host managed-node3 10202 1727204041.19883: done getting next task for host managed-node3 10202 1727204041.19886: ^ task is: TASK: Set flag to indicate system is ostree 10202 1727204041.19889: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.19892: getting variables 10202 1727204041.19894: in VariableManager get_vars() 10202 1727204041.19932: Calling all_inventory to load vars for managed-node3 10202 1727204041.19936: Calling groups_inventory to load vars for managed-node3 10202 1727204041.19940: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.19952: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.19954: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.19957: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.20553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.21190: done with get_vars() 10202 1727204041.21202: done getting variables 10202 1727204041.21314: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.664) 0:00:02.888 ***** 10202 1727204041.21351: entering _queue_task() for managed-node3/set_fact 10202 1727204041.21353: Creating lock for set_fact 10202 1727204041.21749: worker is 1 (out of 1 available) 10202 1727204041.21761: exiting _queue_task() for managed-node3/set_fact 10202 1727204041.21998: done queuing things up, now waiting for results queue to drain 10202 1727204041.22000: waiting for pending results... 
10202 1727204041.22260: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 10202 1727204041.22267: in run() - task 127b8e07-fff9-0b04-2570-0000000000e0 10202 1727204041.22271: variable 'ansible_search_path' from source: unknown 10202 1727204041.22274: variable 'ansible_search_path' from source: unknown 10202 1727204041.22277: calling self._execute() 10202 1727204041.22370: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.22379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.22389: variable 'omit' from source: magic vars 10202 1727204041.22970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204041.23290: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204041.23357: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204041.23401: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204041.23453: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204041.23611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204041.23655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204041.23689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204041.23723: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204041.23953: Evaluated conditional (not __network_is_ostree is defined): True 10202 1727204041.23957: variable 'omit' from source: magic vars 10202 1727204041.23961: variable 'omit' from source: magic vars 10202 1727204041.24094: variable '__ostree_booted_stat' from source: set_fact 10202 1727204041.24154: variable 'omit' from source: magic vars 10202 1727204041.24283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204041.24288: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204041.24291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204041.24294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204041.24310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204041.24350: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204041.24358: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.24369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.24516: Set connection var ansible_shell_type to sh 10202 1727204041.24519: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204041.24532: Set connection var ansible_connection to ssh 10202 1727204041.24610: Set connection var ansible_shell_executable to /bin/sh 10202 1727204041.24613: Set connection var ansible_pipelining to False 10202 1727204041.24615: Set connection var ansible_timeout to 10 10202 1727204041.24617: variable 'ansible_shell_executable' 
from source: unknown 10202 1727204041.24622: variable 'ansible_connection' from source: unknown 10202 1727204041.24627: variable 'ansible_module_compression' from source: unknown 10202 1727204041.24629: variable 'ansible_shell_type' from source: unknown 10202 1727204041.24631: variable 'ansible_shell_executable' from source: unknown 10202 1727204041.24633: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.24635: variable 'ansible_pipelining' from source: unknown 10202 1727204041.24637: variable 'ansible_timeout' from source: unknown 10202 1727204041.24639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.24829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204041.24833: variable 'omit' from source: magic vars 10202 1727204041.24835: starting attempt loop 10202 1727204041.24837: running the handler 10202 1727204041.24839: handler run complete 10202 1727204041.24841: attempt loop complete, returning result 10202 1727204041.24843: _execute() done 10202 1727204041.24845: dumping result to json 10202 1727204041.24846: done dumping result, returning 10202 1727204041.24848: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [127b8e07-fff9-0b04-2570-0000000000e0] 10202 1727204041.24852: sending task result for task 127b8e07-fff9-0b04-2570-0000000000e0 ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 10202 1727204041.25029: no more pending results, returning what we have 10202 1727204041.25032: results queue empty 10202 1727204041.25033: checking for any_errors_fatal 10202 1727204041.25041: done checking for any_errors_fatal 10202 
1727204041.25042: checking for max_fail_percentage 10202 1727204041.25043: done checking for max_fail_percentage 10202 1727204041.25044: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.25045: done checking to see if all hosts have failed 10202 1727204041.25046: getting the remaining hosts for this loop 10202 1727204041.25048: done getting the remaining hosts for this loop 10202 1727204041.25052: getting the next task for host managed-node3 10202 1727204041.25061: done getting next task for host managed-node3 10202 1727204041.25168: ^ task is: TASK: Fix CentOS6 Base repo 10202 1727204041.25172: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.25179: getting variables 10202 1727204041.25181: in VariableManager get_vars() 10202 1727204041.25216: Calling all_inventory to load vars for managed-node3 10202 1727204041.25220: Calling groups_inventory to load vars for managed-node3 10202 1727204041.25227: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.25239: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.25242: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.25245: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.25617: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000e0 10202 1727204041.25632: WORKER PROCESS EXITING 10202 1727204041.25969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.26160: done with get_vars() 10202 1727204041.26379: done getting variables 10202 1727204041.26516: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.051) 0:00:02.940 ***** 10202 1727204041.26548: entering _queue_task() for managed-node3/copy 10202 1727204041.27022: worker is 1 (out of 1 available) 10202 1727204041.27035: exiting _queue_task() for managed-node3/copy 10202 1727204041.27050: done queuing things up, now waiting for results queue to drain 10202 1727204041.27051: waiting for pending results... 
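The ok result for "Set flag to indicate system is ostree" above reports `ansible_facts: {"__network_is_ostree": false}`. Judging from the trace alone (the conditional `not __network_is_ostree is defined` and the `__ostree_booted_stat` fact it reads), the task at el_repo_setup.yml:22 is presumably shaped roughly like this; the exact stat attribute consulted is an assumption, not read from the source file:

```yaml
# Hedged reconstruction, not the verbatim task: the variable names come
# from the trace above, but using .stat.exists is an assumption.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

With `__ostree_booted_stat.stat.exists` false on these test nodes, the fact comes out `false`, which is exactly what the result above shows.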
10202 1727204041.27387: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 10202 1727204041.27793: in run() - task 127b8e07-fff9-0b04-2570-0000000000e2 10202 1727204041.27797: variable 'ansible_search_path' from source: unknown 10202 1727204041.27804: variable 'ansible_search_path' from source: unknown 10202 1727204041.27808: calling self._execute() 10202 1727204041.27811: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.27813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.27815: variable 'omit' from source: magic vars 10202 1727204041.28599: variable 'ansible_distribution' from source: facts 10202 1727204041.28626: Evaluated conditional (ansible_distribution == 'CentOS'): False 10202 1727204041.28631: when evaluation is False, skipping this task 10202 1727204041.28633: _execute() done 10202 1727204041.28636: dumping result to json 10202 1727204041.28638: done dumping result, returning 10202 1727204041.28644: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [127b8e07-fff9-0b04-2570-0000000000e2] 10202 1727204041.28652: sending task result for task 127b8e07-fff9-0b04-2570-0000000000e2 10202 1727204041.28886: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000e2 10202 1727204041.28891: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 10202 1727204041.28968: no more pending results, returning what we have 10202 1727204041.28972: results queue empty 10202 1727204041.28973: checking for any_errors_fatal 10202 1727204041.28979: done checking for any_errors_fatal 10202 1727204041.28980: checking for max_fail_percentage 10202 1727204041.28982: done checking for max_fail_percentage 10202 1727204041.28983: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.28984: done 
checking to see if all hosts have failed 10202 1727204041.28985: getting the remaining hosts for this loop 10202 1727204041.28987: done getting the remaining hosts for this loop 10202 1727204041.28991: getting the next task for host managed-node3 10202 1727204041.28999: done getting next task for host managed-node3 10202 1727204041.29003: ^ task is: TASK: Include the task 'enable_epel.yml' 10202 1727204041.29006: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204041.29010: getting variables 10202 1727204041.29012: in VariableManager get_vars() 10202 1727204041.29046: Calling all_inventory to load vars for managed-node3 10202 1727204041.29049: Calling groups_inventory to load vars for managed-node3 10202 1727204041.29054: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.29278: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.29283: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.29292: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.29887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.30324: done with get_vars() 10202 1727204041.30337: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.039) 0:00:02.980 ***** 10202 1727204041.30545: entering _queue_task() for managed-node3/include_tasks 10202 1727204041.31322: worker is 1 (out of 1 available) 10202 1727204041.31338: exiting _queue_task() for managed-node3/include_tasks 10202 1727204041.31570: done queuing things up, now waiting for results queue to drain 10202 1727204041.31572: waiting for pending results... 10202 1727204041.31899: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 10202 1727204041.32673: in run() - task 127b8e07-fff9-0b04-2570-0000000000e3 10202 1727204041.32679: variable 'ansible_search_path' from source: unknown 10202 1727204041.32683: variable 'ansible_search_path' from source: unknown 10202 1727204041.32688: calling self._execute() 10202 1727204041.32726: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.32742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.32758: variable 'omit' from source: magic vars 10202 1727204041.33921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204041.38298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204041.38387: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204041.38435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204041.38497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204041.38529: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204041.38633: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204041.38672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204041.38714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204041.38764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204041.38787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204041.38924: variable '__network_is_ostree' from source: set_fact 10202 1727204041.38950: Evaluated conditional (not __network_is_ostree | d(false)): True 10202 1727204041.38960: _execute() done 10202 1727204041.38969: dumping result to json 10202 1727204041.38976: done dumping result, returning 10202 1727204041.38985: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-0b04-2570-0000000000e3] 10202 1727204041.38994: sending task result for task 127b8e07-fff9-0b04-2570-0000000000e3 10202 1727204041.39109: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000e3 10202 1727204041.39170: no more pending results, returning what we have 10202 1727204041.39175: in VariableManager get_vars() 10202 1727204041.39213: Calling all_inventory to load vars for managed-node3 10202 1727204041.39216: Calling groups_inventory to 
load vars for managed-node3 10202 1727204041.39220: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.39233: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.39237: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.39240: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.39808: WORKER PROCESS EXITING 10202 1727204041.39838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.40289: done with get_vars() 10202 1727204041.40300: variable 'ansible_search_path' from source: unknown 10202 1727204041.40301: variable 'ansible_search_path' from source: unknown 10202 1727204041.40346: we have included files to process 10202 1727204041.40347: generating all_blocks data 10202 1727204041.40349: done generating all_blocks data 10202 1727204041.40356: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10202 1727204041.40358: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10202 1727204041.40360: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10202 1727204041.43011: done processing included file 10202 1727204041.43013: iterating over new_blocks loaded from include file 10202 1727204041.43015: in VariableManager get_vars() 10202 1727204041.43029: done with get_vars() 10202 1727204041.43032: filtering new block on tags 10202 1727204041.43058: done filtering new block on tags 10202 1727204041.43061: in VariableManager get_vars() 10202 1727204041.43077: done with get_vars() 10202 1727204041.43079: filtering new block on tags 10202 1727204041.43092: done filtering new block on tags 10202 1727204041.43094: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 10202 1727204041.43101: extending task lists for all hosts with included blocks 10202 1727204041.43227: done extending task lists 10202 1727204041.43229: done processing included files 10202 1727204041.43230: results queue empty 10202 1727204041.43231: checking for any_errors_fatal 10202 1727204041.43234: done checking for any_errors_fatal 10202 1727204041.43235: checking for max_fail_percentage 10202 1727204041.43236: done checking for max_fail_percentage 10202 1727204041.43237: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.43238: done checking to see if all hosts have failed 10202 1727204041.43238: getting the remaining hosts for this loop 10202 1727204041.43240: done getting the remaining hosts for this loop 10202 1727204041.43243: getting the next task for host managed-node3 10202 1727204041.43247: done getting next task for host managed-node3 10202 1727204041.43249: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 10202 1727204041.43253: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.43255: getting variables 10202 1727204041.43256: in VariableManager get_vars() 10202 1727204041.43269: Calling all_inventory to load vars for managed-node3 10202 1727204041.43271: Calling groups_inventory to load vars for managed-node3 10202 1727204041.43273: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.43279: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.43285: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.43287: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.43432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.43626: done with get_vars() 10202 1727204041.43637: done getting variables 10202 1727204041.43720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 10202 1727204041.43949: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.134) 0:00:03.115 ***** 10202 1727204041.44000: entering _queue_task() for managed-node3/command 10202 1727204041.44002: Creating lock for command 10202 1727204041.44455: worker is 1 (out of 1 available) 10202 1727204041.44469: exiting _queue_task() for managed-node3/command 10202 1727204041.44484: done queuing things up, now waiting for results queue to drain 10202 1727204041.44486: waiting for pending results... 
10202 1727204041.44969: running TaskExecutor() for managed-node3/TASK: Create EPEL 40 10202 1727204041.45422: in run() - task 127b8e07-fff9-0b04-2570-0000000000fd 10202 1727204041.45427: variable 'ansible_search_path' from source: unknown 10202 1727204041.45430: variable 'ansible_search_path' from source: unknown 10202 1727204041.45432: calling self._execute() 10202 1727204041.45717: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.45733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.45750: variable 'omit' from source: magic vars 10202 1727204041.46583: variable 'ansible_distribution' from source: facts 10202 1727204041.46773: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10202 1727204041.46777: when evaluation is False, skipping this task 10202 1727204041.46781: _execute() done 10202 1727204041.46783: dumping result to json 10202 1727204041.46786: done dumping result, returning 10202 1727204041.46790: done running TaskExecutor() for managed-node3/TASK: Create EPEL 40 [127b8e07-fff9-0b04-2570-0000000000fd] 10202 1727204041.46837: sending task result for task 127b8e07-fff9-0b04-2570-0000000000fd skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10202 1727204041.47270: no more pending results, returning what we have 10202 1727204041.47275: results queue empty 10202 1727204041.47276: checking for any_errors_fatal 10202 1727204041.47278: done checking for any_errors_fatal 10202 1727204041.47279: checking for max_fail_percentage 10202 1727204041.47280: done checking for max_fail_percentage 10202 1727204041.47281: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.47283: done checking to see if all hosts have failed 10202 1727204041.47284: getting the remaining hosts for this loop 10202 1727204041.47285: done 
getting the remaining hosts for this loop 10202 1727204041.47290: getting the next task for host managed-node3 10202 1727204041.47297: done getting next task for host managed-node3 10202 1727204041.47300: ^ task is: TASK: Install yum-utils package 10202 1727204041.47305: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.47310: getting variables 10202 1727204041.47312: in VariableManager get_vars() 10202 1727204041.47349: Calling all_inventory to load vars for managed-node3 10202 1727204041.47353: Calling groups_inventory to load vars for managed-node3 10202 1727204041.47358: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.47461: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.47465: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.47472: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.48344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.48800: done with get_vars() 10202 1727204041.48814: done getting variables 10202 1727204041.49163: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 10202 1727204041.49197: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000fd 10202 1727204041.49200: WORKER PROCESS EXITING TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.052) 0:00:03.167 ***** 10202 1727204041.49214: entering _queue_task() for managed-node3/package 10202 1727204041.49216: Creating lock for package 10202 1727204041.49978: worker is 1 (out of 1 available) 10202 1727204041.49993: exiting _queue_task() for managed-node3/package 10202 1727204041.50007: done queuing things up, now waiting for results queue to drain 10202 1727204041.50009: waiting for pending results... 
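The "Create EPEL 40" skip above (and the identical skips that follow) illustrate the standard `when:` guard pattern: Ansible renders the templated task name, evaluates the conditional before the action runs, and emits a `skipping:` result carrying `false_condition` when it fails. A minimal sketch of such a guard — the command body here is a placeholder, not taken from enable_epel.yml:

```yaml
# Illustrative only: a task guarded the way the trace shows.
# When the conditional is False, Ansible logs "when evaluation is
# False, skipping this task" and never executes the action.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: echo "set up EPEL here"   # placeholder action
  when: ansible_distribution in ['RedHat', 'CentOS']
```

On this Fedora-family test node `ansible_distribution` is not in the list, so every EPEL task in the included file skips with the same `false_condition`.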
10202 1727204041.50545: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 10202 1727204041.50919: in run() - task 127b8e07-fff9-0b04-2570-0000000000fe 10202 1727204041.50943: variable 'ansible_search_path' from source: unknown 10202 1727204041.50950: variable 'ansible_search_path' from source: unknown 10202 1727204041.50998: calling self._execute() 10202 1727204041.51105: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.51247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.51263: variable 'omit' from source: magic vars 10202 1727204041.52160: variable 'ansible_distribution' from source: facts 10202 1727204041.52184: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10202 1727204041.52193: when evaluation is False, skipping this task 10202 1727204041.52202: _execute() done 10202 1727204041.52223: dumping result to json 10202 1727204041.52233: done dumping result, returning 10202 1727204041.52282: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [127b8e07-fff9-0b04-2570-0000000000fe] 10202 1727204041.52295: sending task result for task 127b8e07-fff9-0b04-2570-0000000000fe 10202 1727204041.52753: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000fe 10202 1727204041.52756: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10202 1727204041.52834: no more pending results, returning what we have 10202 1727204041.52838: results queue empty 10202 1727204041.52839: checking for any_errors_fatal 10202 1727204041.52850: done checking for any_errors_fatal 10202 1727204041.52851: checking for max_fail_percentage 10202 1727204041.52852: done checking for max_fail_percentage 10202 1727204041.52853: checking to see if all hosts have failed and the running result is not ok 
10202 1727204041.52854: done checking to see if all hosts have failed 10202 1727204041.52855: getting the remaining hosts for this loop 10202 1727204041.52857: done getting the remaining hosts for this loop 10202 1727204041.52862: getting the next task for host managed-node3 10202 1727204041.52872: done getting next task for host managed-node3 10202 1727204041.52874: ^ task is: TASK: Enable EPEL 7 10202 1727204041.52879: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.52882: getting variables 10202 1727204041.52883: in VariableManager get_vars() 10202 1727204041.52917: Calling all_inventory to load vars for managed-node3 10202 1727204041.52920: Calling groups_inventory to load vars for managed-node3 10202 1727204041.52923: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.52938: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.52941: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.52943: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.53313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.53725: done with get_vars() 10202 1727204041.53739: done getting variables 10202 1727204041.54015: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.048) 0:00:03.215 ***** 10202 1727204041.54052: entering _queue_task() for managed-node3/command 10202 1727204041.54863: worker is 1 (out of 1 available) 10202 1727204041.54938: exiting _queue_task() for managed-node3/command 10202 1727204041.54950: done queuing things up, now waiting for results queue to drain 10202 1727204041.54951: waiting for pending results... 
10202 1727204041.55250: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 10202 1727204041.55618: in run() - task 127b8e07-fff9-0b04-2570-0000000000ff 10202 1727204041.55640: variable 'ansible_search_path' from source: unknown 10202 1727204041.55648: variable 'ansible_search_path' from source: unknown 10202 1727204041.55696: calling self._execute() 10202 1727204041.55972: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.55988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.56003: variable 'omit' from source: magic vars 10202 1727204041.56846: variable 'ansible_distribution' from source: facts 10202 1727204041.56870: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10202 1727204041.57071: when evaluation is False, skipping this task 10202 1727204041.57075: _execute() done 10202 1727204041.57078: dumping result to json 10202 1727204041.57081: done dumping result, returning 10202 1727204041.57084: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [127b8e07-fff9-0b04-2570-0000000000ff] 10202 1727204041.57086: sending task result for task 127b8e07-fff9-0b04-2570-0000000000ff 10202 1727204041.57167: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000ff 10202 1727204041.57171: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10202 1727204041.57230: no more pending results, returning what we have 10202 1727204041.57234: results queue empty 10202 1727204041.57234: checking for any_errors_fatal 10202 1727204041.57241: done checking for any_errors_fatal 10202 1727204041.57241: checking for max_fail_percentage 10202 1727204041.57243: done checking for max_fail_percentage 10202 1727204041.57244: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.57245: 
done checking to see if all hosts have failed 10202 1727204041.57245: getting the remaining hosts for this loop 10202 1727204041.57247: done getting the remaining hosts for this loop 10202 1727204041.57251: getting the next task for host managed-node3 10202 1727204041.57258: done getting next task for host managed-node3 10202 1727204041.57261: ^ task is: TASK: Enable EPEL 8 10202 1727204041.57267: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.57271: getting variables 10202 1727204041.57272: in VariableManager get_vars() 10202 1727204041.57306: Calling all_inventory to load vars for managed-node3 10202 1727204041.57309: Calling groups_inventory to load vars for managed-node3 10202 1727204041.57312: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.57330: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.57332: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.57336: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.57888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.58415: done with get_vars() 10202 1727204041.58433: done getting variables 10202 1727204041.58572: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.045) 0:00:03.261 ***** 10202 1727204041.58607: entering _queue_task() for managed-node3/command 10202 1727204041.59391: worker is 1 (out of 1 available) 10202 1727204041.59406: exiting _queue_task() for managed-node3/command 10202 1727204041.59421: done queuing things up, now waiting for results queue to drain 10202 1727204041.59422: waiting for pending results... 
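The "Enable EPEL 7" trace above shows the heart of Ansible's `when` handling: the templated conditional is evaluated against the host's gathered facts, and a False result short-circuits the task into a skipped result rather than an error. A minimal Python sketch of that decision follows; the function, fact value, and dict shape are illustrative stand-ins, not Ansible's actual internals (though the result keys mirror the skipped JSON printed in the log):

```python
# Illustrative sketch of conditional task skipping (not Ansible's real code).
def run_task(facts: dict) -> dict:
    condition = "ansible_distribution in ['RedHat', 'CentOS']"
    # Mirrors the log's "Evaluated conditional (...): False" step:
    # test the condition against this host's facts before executing.
    if facts["ansible_distribution"] not in ["RedHat", "CentOS"]:
        # Mirrors "when evaluation is False, skipping this task".
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# The managed node in this run is evidently not RHEL/CentOS (the exact
# distribution is an assumption here), so every EPEL task skips.
result = run_task({"ansible_distribution": "Fedora"})
```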
10202 1727204041.59886: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 10202 1727204041.60374: in run() - task 127b8e07-fff9-0b04-2570-000000000100 10202 1727204041.60380: variable 'ansible_search_path' from source: unknown 10202 1727204041.60384: variable 'ansible_search_path' from source: unknown 10202 1727204041.60387: calling self._execute() 10202 1727204041.60390: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.60394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.60396: variable 'omit' from source: magic vars 10202 1727204041.61179: variable 'ansible_distribution' from source: facts 10202 1727204041.61300: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10202 1727204041.61364: when evaluation is False, skipping this task 10202 1727204041.61376: _execute() done 10202 1727204041.61385: dumping result to json 10202 1727204041.61394: done dumping result, returning 10202 1727204041.61406: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [127b8e07-fff9-0b04-2570-000000000100] 10202 1727204041.61419: sending task result for task 127b8e07-fff9-0b04-2570-000000000100 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10202 1727204041.61583: no more pending results, returning what we have 10202 1727204041.61587: results queue empty 10202 1727204041.61588: checking for any_errors_fatal 10202 1727204041.61600: done checking for any_errors_fatal 10202 1727204041.61601: checking for max_fail_percentage 10202 1727204041.61603: done checking for max_fail_percentage 10202 1727204041.61604: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.61610: done checking to see if all hosts have failed 10202 1727204041.61611: getting the remaining hosts for this loop 10202 1727204041.61613: done 
getting the remaining hosts for this loop 10202 1727204041.61618: getting the next task for host managed-node3 10202 1727204041.61631: done getting next task for host managed-node3 10202 1727204041.61635: ^ task is: TASK: Enable EPEL 6 10202 1727204041.61640: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.61644: getting variables 10202 1727204041.61646: in VariableManager get_vars() 10202 1727204041.61686: Calling all_inventory to load vars for managed-node3 10202 1727204041.61689: Calling groups_inventory to load vars for managed-node3 10202 1727204041.61694: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.61840: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.61844: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.61848: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.62018: done sending task result for task 127b8e07-fff9-0b04-2570-000000000100 10202 1727204041.62021: WORKER PROCESS EXITING 10202 1727204041.62300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.62691: done with get_vars() 10202 1727204041.62710: done getting variables 10202 1727204041.62776: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.042) 0:00:03.303 ***** 10202 1727204041.62931: entering _queue_task() for managed-node3/copy 10202 1727204041.63591: worker is 1 (out of 1 available) 10202 1727204041.63722: exiting _queue_task() for managed-node3/copy 10202 1727204041.63739: done queuing things up, now waiting for results queue to drain 10202 1727204041.63740: waiting for pending results... 
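Around each task, the strategy plugin hands work to a pool of forked workers and then drains a results queue; with forks effectively at 1 in this run, the log keeps repeating "worker is 1 (out of 1 available)" and "waiting for pending results...". A toy sketch of that producer/worker pattern is below, using a thread and `queue.Queue` instead of Ansible's forked worker processes; all names and the task tuples are illustrative:

```python
# Toy model of queue_task / results-queue draining (threads, not Ansible's
# forked WorkerProcess objects; purely illustrative).
import queue
import threading

def worker(tasks: "queue.Queue", results: "queue.Queue") -> None:
    while True:
        task = tasks.get()
        if task is None:              # sentinel: no more work queued
            break
        host, name, when_ok = task
        # Mirrors "when evaluation is False, skipping this task".
        results.put((host, name, "ok" if when_ok else "skipped"))

tasks: "queue.Queue" = queue.Queue()
results: "queue.Queue" = queue.Queue()
t = threading.Thread(target=worker, args=(tasks, results))
t.start()

tasks.put(("managed-node3", "Enable EPEL 7", False))
tasks.put(("managed-node3", "Set network provider to 'nm'", True))
tasks.put(None)
t.join()                              # "waiting for results queue to drain"

drained = [results.get_nowait() for _ in range(results.qsize())]
```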
10202 1727204041.64212: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 10202 1727204041.64473: in run() - task 127b8e07-fff9-0b04-2570-000000000102 10202 1727204041.64480: variable 'ansible_search_path' from source: unknown 10202 1727204041.64483: variable 'ansible_search_path' from source: unknown 10202 1727204041.64819: calling self._execute() 10202 1727204041.64823: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.64826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.64828: variable 'omit' from source: magic vars 10202 1727204041.65671: variable 'ansible_distribution' from source: facts 10202 1727204041.65813: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10202 1727204041.65823: when evaluation is False, skipping this task 10202 1727204041.65832: _execute() done 10202 1727204041.65841: dumping result to json 10202 1727204041.65851: done dumping result, returning 10202 1727204041.65912: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [127b8e07-fff9-0b04-2570-000000000102] 10202 1727204041.65927: sending task result for task 127b8e07-fff9-0b04-2570-000000000102 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10202 1727204041.66226: no more pending results, returning what we have 10202 1727204041.66230: results queue empty 10202 1727204041.66231: checking for any_errors_fatal 10202 1727204041.66236: done checking for any_errors_fatal 10202 1727204041.66237: checking for max_fail_percentage 10202 1727204041.66238: done checking for max_fail_percentage 10202 1727204041.66239: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.66240: done checking to see if all hosts have failed 10202 1727204041.66241: getting the remaining hosts for this loop 10202 1727204041.66243: done 
getting the remaining hosts for this loop 10202 1727204041.66247: getting the next task for host managed-node3 10202 1727204041.66256: done getting next task for host managed-node3 10202 1727204041.66259: ^ task is: TASK: Set network provider to 'nm' 10202 1727204041.66262: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204041.66267: getting variables 10202 1727204041.66269: in VariableManager get_vars() 10202 1727204041.66306: Calling all_inventory to load vars for managed-node3 10202 1727204041.66309: Calling groups_inventory to load vars for managed-node3 10202 1727204041.66314: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.66446: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.66450: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.66455: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.66897: done sending task result for task 127b8e07-fff9-0b04-2570-000000000102 10202 1727204041.66902: WORKER PROCESS EXITING 10202 1727204041.66919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.67318: done with get_vars() 10202 1727204041.67334: done getting variables 10202 1727204041.67406: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.047) 0:00:03.350 ***** 10202 1727204041.67555: entering _queue_task() for managed-node3/set_fact 10202 1727204041.68249: worker is 1 (out of 1 available) 10202 1727204041.68399: exiting _queue_task() for managed-node3/set_fact 10202 1727204041.68415: done queuing things up, now waiting for results queue to drain 10202 1727204041.68417: waiting for pending results... 10202 1727204041.68885: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 10202 1727204041.69120: in run() - task 127b8e07-fff9-0b04-2570-000000000007 10202 1727204041.69125: variable 'ansible_search_path' from source: unknown 10202 1727204041.69129: calling self._execute() 10202 1727204041.69301: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.69315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.69348: variable 'omit' from source: magic vars 10202 1727204041.69772: variable 'omit' from source: magic vars 10202 1727204041.69777: variable 'omit' from source: magic vars 10202 1727204041.69780: variable 'omit' from source: magic vars 10202 1727204041.69926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204041.69973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204041.70170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204041.70173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204041.70175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204041.70178: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204041.70180: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.70256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.70648: Set connection var ansible_shell_type to sh 10202 1727204041.70652: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204041.70654: Set connection var ansible_connection to ssh 10202 1727204041.70656: Set connection var ansible_shell_executable to /bin/sh 10202 1727204041.70658: Set connection var ansible_pipelining to False 10202 1727204041.70660: Set connection var ansible_timeout to 10 10202 1727204041.70662: variable 'ansible_shell_executable' from source: unknown 10202 1727204041.70664: variable 'ansible_connection' from source: unknown 10202 1727204041.70667: variable 'ansible_module_compression' from source: unknown 10202 1727204041.70669: variable 'ansible_shell_type' from source: unknown 10202 1727204041.70671: variable 'ansible_shell_executable' from source: unknown 10202 1727204041.70673: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.70675: variable 'ansible_pipelining' from source: unknown 10202 1727204041.70677: variable 'ansible_timeout' from source: unknown 10202 1727204041.70679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.71035: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204041.71201: variable 'omit' from source: magic vars 10202 1727204041.71271: starting attempt loop 10202 1727204041.71275: running the handler 10202 1727204041.71277: handler run complete 10202 1727204041.71280: attempt loop 
complete, returning result 10202 1727204041.71282: _execute() done 10202 1727204041.71283: dumping result to json 10202 1727204041.71285: done dumping result, returning 10202 1727204041.71287: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [127b8e07-fff9-0b04-2570-000000000007] 10202 1727204041.71299: sending task result for task 127b8e07-fff9-0b04-2570-000000000007 10202 1727204041.71536: done sending task result for task 127b8e07-fff9-0b04-2570-000000000007 10202 1727204041.71541: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 10202 1727204041.71607: no more pending results, returning what we have 10202 1727204041.71611: results queue empty 10202 1727204041.71612: checking for any_errors_fatal 10202 1727204041.71618: done checking for any_errors_fatal 10202 1727204041.71619: checking for max_fail_percentage 10202 1727204041.71621: done checking for max_fail_percentage 10202 1727204041.71622: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.71623: done checking to see if all hosts have failed 10202 1727204041.71626: getting the remaining hosts for this loop 10202 1727204041.71628: done getting the remaining hosts for this loop 10202 1727204041.71633: getting the next task for host managed-node3 10202 1727204041.71641: done getting next task for host managed-node3 10202 1727204041.71643: ^ task is: TASK: meta (flush_handlers) 10202 1727204041.71645: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.71651: getting variables 10202 1727204041.71653: in VariableManager get_vars() 10202 1727204041.71692: Calling all_inventory to load vars for managed-node3 10202 1727204041.71695: Calling groups_inventory to load vars for managed-node3 10202 1727204041.71699: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.71713: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.71716: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.71719: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.72209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.72577: done with get_vars() 10202 1727204041.72591: done getting variables 10202 1727204041.72791: in VariableManager get_vars() 10202 1727204041.72804: Calling all_inventory to load vars for managed-node3 10202 1727204041.72807: Calling groups_inventory to load vars for managed-node3 10202 1727204041.72810: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.72815: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.72818: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.72821: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.73659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.74230: done with get_vars() 10202 1727204041.74255: done queuing things up, now waiting for results queue to drain 10202 1727204041.74258: results queue empty 10202 1727204041.74259: checking for any_errors_fatal 10202 1727204041.74262: done checking for any_errors_fatal 10202 1727204041.74263: checking for max_fail_percentage 10202 1727204041.74264: done checking for max_fail_percentage 10202 1727204041.74267: checking to see if all hosts have failed and the running result is not 
ok 10202 1727204041.74268: done checking to see if all hosts have failed 10202 1727204041.74269: getting the remaining hosts for this loop 10202 1727204041.74270: done getting the remaining hosts for this loop 10202 1727204041.74274: getting the next task for host managed-node3 10202 1727204041.74278: done getting next task for host managed-node3 10202 1727204041.74280: ^ task is: TASK: meta (flush_handlers) 10202 1727204041.74282: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204041.74291: getting variables 10202 1727204041.74293: in VariableManager get_vars() 10202 1727204041.74304: Calling all_inventory to load vars for managed-node3 10202 1727204041.74306: Calling groups_inventory to load vars for managed-node3 10202 1727204041.74309: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.74316: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.74319: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.74322: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.74645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.75036: done with get_vars() 10202 1727204041.75049: done getting variables 10202 1727204041.75131: in VariableManager get_vars() 10202 1727204041.75143: Calling all_inventory to load vars for managed-node3 10202 1727204041.75145: Calling groups_inventory to load vars for managed-node3 10202 1727204041.75148: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.75153: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.75155: Calling groups_plugins_inventory to load vars for 
managed-node3 10202 1727204041.75158: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.75515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.75955: done with get_vars() 10202 1727204041.76090: done queuing things up, now waiting for results queue to drain 10202 1727204041.76093: results queue empty 10202 1727204041.76094: checking for any_errors_fatal 10202 1727204041.76095: done checking for any_errors_fatal 10202 1727204041.76096: checking for max_fail_percentage 10202 1727204041.76097: done checking for max_fail_percentage 10202 1727204041.76098: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.76099: done checking to see if all hosts have failed 10202 1727204041.76100: getting the remaining hosts for this loop 10202 1727204041.76101: done getting the remaining hosts for this loop 10202 1727204041.76104: getting the next task for host managed-node3 10202 1727204041.76108: done getting next task for host managed-node3 10202 1727204041.76108: ^ task is: None 10202 1727204041.76110: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.76111: done queuing things up, now waiting for results queue to drain 10202 1727204041.76112: results queue empty 10202 1727204041.76113: checking for any_errors_fatal 10202 1727204041.76114: done checking for any_errors_fatal 10202 1727204041.76114: checking for max_fail_percentage 10202 1727204041.76115: done checking for max_fail_percentage 10202 1727204041.76116: checking to see if all hosts have failed and the running result is not ok 10202 1727204041.76117: done checking to see if all hosts have failed 10202 1727204041.76118: getting the next task for host managed-node3 10202 1727204041.76121: done getting next task for host managed-node3 10202 1727204041.76121: ^ task is: None 10202 1727204041.76123: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.76283: in VariableManager get_vars() 10202 1727204041.76316: done with get_vars() 10202 1727204041.76322: in VariableManager get_vars() 10202 1727204041.76339: done with get_vars() 10202 1727204041.76344: variable 'omit' from source: magic vars 10202 1727204041.76378: in VariableManager get_vars() 10202 1727204041.76395: done with get_vars() 10202 1727204041.76563: variable 'omit' from source: magic vars PLAY [Play for testing bond connection] **************************************** 10202 1727204041.77521: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 10202 1727204041.77557: getting the remaining hosts for this loop 10202 1727204041.77559: done getting the remaining hosts for this loop 10202 1727204041.77562: getting the next task for host managed-node3 10202 1727204041.77568: done getting next task for host managed-node3 10202 1727204041.77570: ^ task is: TASK: Gathering Facts 10202 1727204041.77572: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204041.77575: getting variables 10202 1727204041.77576: in VariableManager get_vars() 10202 1727204041.77592: Calling all_inventory to load vars for managed-node3 10202 1727204041.77600: Calling groups_inventory to load vars for managed-node3 10202 1727204041.77602: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204041.77610: Calling all_plugins_play to load vars for managed-node3 10202 1727204041.77629: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204041.77634: Calling groups_plugins_play to load vars for managed-node3 10202 1727204041.77796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204041.78011: done with get_vars() 10202 1727204041.78021: done getting variables 10202 1727204041.78082: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.105) 0:00:03.456 ***** 10202 1727204041.78109: entering _queue_task() for managed-node3/gather_facts 10202 1727204041.78590: worker is 1 (out of 1 available) 10202 1727204041.78602: exiting _queue_task() for managed-node3/gather_facts 10202 1727204041.78613: done queuing things up, now waiting for results queue to drain 10202 1727204041.78614: waiting for pending results... 
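The "Set network provider to 'nm'" task traced earlier runs entirely on the controller: `set_fact` merges new key/value pairs into the host's fact store and reports `changed: false`, since nothing on the remote host is touched. A hedged sketch of what that "ok" result amounts to (the helper function is invented for illustration; only the result shape comes from the log):

```python
# Illustrative model of the set_fact action: update the controller-side fact
# store and return an unchanged "ok" result, as in the log's
# ok: [managed-node3] => {"ansible_facts": {"network_provider": "nm"}, ...}
host_facts: dict = {}

def set_fact(facts: dict, **new_facts) -> dict:
    facts.update(new_facts)           # facts persist for later tasks/templates
    return {"ansible_facts": dict(new_facts), "changed": False}

result = set_fact(host_facts, network_provider="nm")
```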
10202 1727204041.78869: running TaskExecutor() for managed-node3/TASK: Gathering Facts 10202 1727204041.78916: in run() - task 127b8e07-fff9-0b04-2570-000000000128 10202 1727204041.78941: variable 'ansible_search_path' from source: unknown 10202 1727204041.78995: calling self._execute() 10202 1727204041.79150: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.79163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.79178: variable 'omit' from source: magic vars 10202 1727204041.79643: variable 'ansible_distribution_major_version' from source: facts 10202 1727204041.79747: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204041.79751: variable 'omit' from source: magic vars 10202 1727204041.79754: variable 'omit' from source: magic vars 10202 1727204041.79756: variable 'omit' from source: magic vars 10202 1727204041.79819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204041.79874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204041.79905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204041.79933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204041.79952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204041.80005: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204041.80015: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.80022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.80202: Set connection var ansible_shell_type to sh 10202 1727204041.80205: Set connection 
var ansible_module_compression to ZIP_DEFLATED 10202 1727204041.80208: Set connection var ansible_connection to ssh 10202 1727204041.80213: Set connection var ansible_shell_executable to /bin/sh 10202 1727204041.80215: Set connection var ansible_pipelining to False 10202 1727204041.80217: Set connection var ansible_timeout to 10 10202 1727204041.80246: variable 'ansible_shell_executable' from source: unknown 10202 1727204041.80254: variable 'ansible_connection' from source: unknown 10202 1727204041.80260: variable 'ansible_module_compression' from source: unknown 10202 1727204041.80267: variable 'ansible_shell_type' from source: unknown 10202 1727204041.80274: variable 'ansible_shell_executable' from source: unknown 10202 1727204041.80279: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204041.80310: variable 'ansible_pipelining' from source: unknown 10202 1727204041.80312: variable 'ansible_timeout' from source: unknown 10202 1727204041.80315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204041.80685: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204041.80726: variable 'omit' from source: magic vars 10202 1727204041.80758: starting attempt loop 10202 1727204041.80761: running the handler 10202 1727204041.80868: variable 'ansible_facts' from source: unknown 10202 1727204041.80874: _low_level_execute_command(): starting 10202 1727204041.80877: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204041.82177: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204041.82289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204041.82322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204041.82342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204041.82370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204041.82487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204041.84433: stdout chunk (state=3): >>>/root <<< 10202 1727204041.84850: stdout chunk (state=3): >>><<< 10202 1727204041.84854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204041.84856: stderr chunk (state=3): >>><<< 10202 1727204041.84860: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204041.84862: _low_level_execute_command(): starting 10202 1727204041.84869: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782 `" && echo ansible-tmp-1727204041.8481154-10536-25067283964782="` echo /root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782 `" ) && sleep 0' 10202 1727204041.85899: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204041.85924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204041.86041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204041.88276: stdout chunk (state=3): >>>ansible-tmp-1727204041.8481154-10536-25067283964782=/root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782 <<< 10202 1727204041.88474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204041.88484: stdout chunk (state=3): >>><<< 10202 1727204041.88496: stderr chunk (state=3): >>><<< 10202 1727204041.88517: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204041.8481154-10536-25067283964782=/root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204041.88561: variable 'ansible_module_compression' from source: unknown 10202 1727204041.88629: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10202 1727204041.88706: variable 'ansible_facts' from source: unknown 10202 1727204041.88939: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/AnsiballZ_setup.py 10202 1727204041.89234: Sending initial data 10202 1727204041.89253: Sent initial data (153 bytes) 10202 1727204041.90008: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204041.90039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204041.90058: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204041.90084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204041.90310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204041.92108: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10202 1727204041.92128: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204041.92222: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204041.92312: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp3btvaz3s /root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/AnsiballZ_setup.py <<< 10202 1727204041.92315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/AnsiballZ_setup.py" <<< 10202 1727204041.92385: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp3btvaz3s" to remote "/root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/AnsiballZ_setup.py" <<< 10202 1727204041.94146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204041.94280: stderr chunk (state=3): >>><<< 10202 1727204041.94285: stdout chunk (state=3): >>><<< 10202 1727204041.94287: done transferring module to remote 10202 1727204041.94290: _low_level_execute_command(): starting 10202 1727204041.94292: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/ /root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/AnsiballZ_setup.py && sleep 0' 10202 1727204041.94968: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204041.94989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204041.95008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204041.95053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204041.95071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204041.95161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204041.95180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204041.95204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204041.95505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204041.97663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204041.97670: stdout chunk (state=3): >>><<< 10202 1727204041.97677: stderr chunk (state=3): >>><<< 10202 1727204041.97702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204041.97706: _low_level_execute_command(): starting 10202 1727204041.97709: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/AnsiballZ_setup.py && sleep 0' 10202 1727204041.98424: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204041.98660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204041.98664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204041.98668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204041.98670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204041.98672: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204041.98673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204041.98675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204041.98677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204041.98679: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204041.98681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204041.98683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204041.98758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204042.65460: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": 
"/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "02", "epoch": "1727204042", "epoch_int": "1727204042", "date": "2024-09-24", "time": "14:54:02", "iso8601_micro": "2024-09-24T18:54:02.311373Z", "iso8601": "2024-09-24T18:54:02Z", "iso8601_basic": "20240924T145402311373", "iso8601_basic_short": "20240924T145402", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_loadavg": {"1m": 0.41259765625, "5m": 0.3759765625, "15m": 0.1826171875}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "<<< 10202 1727204042.65468: stdout chunk (state=3): >>>ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", 
"mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3087, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 629, "free": 3087}, "nocache": {"free": 3495, "used": 221}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391",<<< 10202 1727204042.65505: stdout chunk (state=3): >>> "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 379, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", 
"options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251335196672, "block_size": 4096, "block_total": 64479564, "block_available": 61361132, "block_used": 3118432, "inode_total": 16384000, "inode_available": 16301595, "inode_used": 82405, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10202 1727204042.67928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204042.67973: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<< 10202 1727204042.67976: stdout chunk (state=3): >>><<< 10202 1727204042.67979: stderr chunk (state=3): >>><<< 10202 1727204042.68075: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", 
"XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "02", "epoch": "1727204042", "epoch_int": "1727204042", "date": "2024-09-24", "time": "14:54:02", "iso8601_micro": "2024-09-24T18:54:02.311373Z", "iso8601": "2024-09-24T18:54:02Z", "iso8601_basic": "20240924T145402311373", "iso8601_basic_short": "20240924T145402", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_loadavg": {"1m": 0.41259765625, "5m": 0.3759765625, "15m": 0.1826171875}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3087, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 629, "free": 3087}, "nocache": {"free": 3495, "used": 221}, 
"swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 379, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251335196672, "block_size": 4096, "block_total": 64479564, "block_available": 61361132, "block_used": 3118432, "inode_total": 16384000, "inode_available": 16301595, "inode_used": 82405, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 
closed. 10202 1727204042.68472: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204042.68476: _low_level_execute_command(): starting 10202 1727204042.68481: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204041.8481154-10536-25067283964782/ > /dev/null 2>&1 && sleep 0' 10202 1727204042.69211: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204042.69243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204042.69259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204042.69285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204042.69352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204042.69413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204042.69440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204042.69483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204042.69599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204042.71737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204042.71740: stdout chunk (state=3): >>><<< 10202 1727204042.71742: stderr chunk (state=3): >>><<< 10202 1727204042.71759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204042.71776: handler run complete 10202 1727204042.72072: variable 'ansible_facts' from source: unknown 10202 1727204042.72077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204042.72387: variable 'ansible_facts' from source: unknown 10202 1727204042.72488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204042.72632: attempt loop complete, returning result 10202 1727204042.72643: _execute() done 10202 1727204042.72651: dumping result to json 10202 1727204042.72684: done dumping result, returning 10202 1727204042.72699: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-0b04-2570-000000000128] 10202 1727204042.72710: sending task result for task 127b8e07-fff9-0b04-2570-000000000128 10202 1727204042.73273: done sending task result for task 127b8e07-fff9-0b04-2570-000000000128 10202 1727204042.73277: WORKER PROCESS EXITING ok: [managed-node3] 10202 1727204042.73636: no more pending results, returning what we have 10202 1727204042.73639: results queue empty 10202 1727204042.73640: checking for any_errors_fatal 10202 1727204042.73642: done checking for any_errors_fatal 10202 1727204042.73643: checking for max_fail_percentage 10202 1727204042.73644: done checking for max_fail_percentage 10202 1727204042.73645: checking to see if all hosts have failed and the running result is not ok 10202 1727204042.73646: done checking to see if all hosts have failed 10202 1727204042.73647: getting the remaining hosts for this loop 10202 1727204042.73648: done getting the remaining hosts for this loop 10202 1727204042.73652: getting the next task for host managed-node3 10202 1727204042.73658: done getting next task for host managed-node3 10202 1727204042.73660: ^ 
task is: TASK: meta (flush_handlers) 10202 1727204042.73662: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204042.73668: getting variables 10202 1727204042.73670: in VariableManager get_vars() 10202 1727204042.73703: Calling all_inventory to load vars for managed-node3 10202 1727204042.73706: Calling groups_inventory to load vars for managed-node3 10202 1727204042.73709: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204042.73720: Calling all_plugins_play to load vars for managed-node3 10202 1727204042.73723: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204042.73730: Calling groups_plugins_play to load vars for managed-node3 10202 1727204042.73908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204042.74135: done with get_vars() 10202 1727204042.74147: done getting variables 10202 1727204042.74228: in VariableManager get_vars() 10202 1727204042.74245: Calling all_inventory to load vars for managed-node3 10202 1727204042.74248: Calling groups_inventory to load vars for managed-node3 10202 1727204042.74250: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204042.74255: Calling all_plugins_play to load vars for managed-node3 10202 1727204042.74258: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204042.74261: Calling groups_plugins_play to load vars for managed-node3 10202 1727204042.74411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204042.74619: done with get_vars() 10202 1727204042.74637: done queuing things up, now waiting for results queue to drain 10202 
1727204042.74639: results queue empty 10202 1727204042.74640: checking for any_errors_fatal 10202 1727204042.74644: done checking for any_errors_fatal 10202 1727204042.74649: checking for max_fail_percentage 10202 1727204042.74650: done checking for max_fail_percentage 10202 1727204042.74651: checking to see if all hosts have failed and the running result is not ok 10202 1727204042.74652: done checking to see if all hosts have failed 10202 1727204042.74653: getting the remaining hosts for this loop 10202 1727204042.74654: done getting the remaining hosts for this loop 10202 1727204042.74656: getting the next task for host managed-node3 10202 1727204042.74660: done getting next task for host managed-node3 10202 1727204042.74662: ^ task is: TASK: INIT Prepare setup 10202 1727204042.74664: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204042.74668: getting variables 10202 1727204042.74669: in VariableManager get_vars() 10202 1727204042.74682: Calling all_inventory to load vars for managed-node3 10202 1727204042.74684: Calling groups_inventory to load vars for managed-node3 10202 1727204042.74686: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204042.74691: Calling all_plugins_play to load vars for managed-node3 10202 1727204042.74693: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204042.74696: Calling groups_plugins_play to load vars for managed-node3 10202 1727204042.74847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204042.75063: done with get_vars() 10202 1727204042.75075: done getting variables 10202 1727204042.75155: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.970) 0:00:04.427 ***** 10202 1727204042.75185: entering _queue_task() for managed-node3/debug 10202 1727204042.75187: Creating lock for debug 10202 1727204042.75543: worker is 1 (out of 1 available) 10202 1727204042.75558: exiting _queue_task() for managed-node3/debug 10202 1727204042.75577: done queuing things up, now waiting for results queue to drain 10202 1727204042.75579: waiting for pending results... 
10202 1727204042.75857: running TaskExecutor() for managed-node3/TASK: INIT Prepare setup 10202 1727204042.75962: in run() - task 127b8e07-fff9-0b04-2570-00000000000b 10202 1727204042.75987: variable 'ansible_search_path' from source: unknown 10202 1727204042.76033: calling self._execute() 10202 1727204042.76138: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204042.76161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204042.76179: variable 'omit' from source: magic vars 10202 1727204042.76876: variable 'ansible_distribution_major_version' from source: facts 10202 1727204042.76879: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204042.76881: variable 'omit' from source: magic vars 10202 1727204042.76918: variable 'omit' from source: magic vars 10202 1727204042.76969: variable 'omit' from source: magic vars 10202 1727204042.77015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204042.77062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204042.77093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204042.77116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204042.77135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204042.77173: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204042.77181: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204042.77188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204042.77299: Set connection var ansible_shell_type to sh 10202 1727204042.77312: Set 
connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204042.77323: Set connection var ansible_connection to ssh 10202 1727204042.77336: Set connection var ansible_shell_executable to /bin/sh 10202 1727204042.77347: Set connection var ansible_pipelining to False 10202 1727204042.77357: Set connection var ansible_timeout to 10 10202 1727204042.77390: variable 'ansible_shell_executable' from source: unknown 10202 1727204042.77400: variable 'ansible_connection' from source: unknown 10202 1727204042.77407: variable 'ansible_module_compression' from source: unknown 10202 1727204042.77415: variable 'ansible_shell_type' from source: unknown 10202 1727204042.77426: variable 'ansible_shell_executable' from source: unknown 10202 1727204042.77435: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204042.77443: variable 'ansible_pipelining' from source: unknown 10202 1727204042.77450: variable 'ansible_timeout' from source: unknown 10202 1727204042.77459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204042.77613: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204042.77635: variable 'omit' from source: magic vars 10202 1727204042.77770: starting attempt loop 10202 1727204042.77774: running the handler 10202 1727204042.77776: handler run complete 10202 1727204042.77779: attempt loop complete, returning result 10202 1727204042.77781: _execute() done 10202 1727204042.77783: dumping result to json 10202 1727204042.77785: done dumping result, returning 10202 1727204042.77788: done running TaskExecutor() for managed-node3/TASK: INIT Prepare setup [127b8e07-fff9-0b04-2570-00000000000b] 10202 1727204042.77791: sending task result for task 
127b8e07-fff9-0b04-2570-00000000000b ok: [managed-node3] => {} MSG: ################################################## 10202 1727204042.77930: no more pending results, returning what we have 10202 1727204042.77932: results queue empty 10202 1727204042.77933: checking for any_errors_fatal 10202 1727204042.77935: done checking for any_errors_fatal 10202 1727204042.77936: checking for max_fail_percentage 10202 1727204042.77937: done checking for max_fail_percentage 10202 1727204042.77938: checking to see if all hosts have failed and the running result is not ok 10202 1727204042.77939: done checking to see if all hosts have failed 10202 1727204042.77940: getting the remaining hosts for this loop 10202 1727204042.77941: done getting the remaining hosts for this loop 10202 1727204042.77945: getting the next task for host managed-node3 10202 1727204042.77952: done getting next task for host managed-node3 10202 1727204042.77955: ^ task is: TASK: Install dnsmasq 10202 1727204042.77962: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204042.77966: getting variables 10202 1727204042.77970: in VariableManager get_vars() 10202 1727204042.78194: Calling all_inventory to load vars for managed-node3 10202 1727204042.78197: Calling groups_inventory to load vars for managed-node3 10202 1727204042.78201: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204042.78207: done sending task result for task 127b8e07-fff9-0b04-2570-00000000000b 10202 1727204042.78210: WORKER PROCESS EXITING 10202 1727204042.78219: Calling all_plugins_play to load vars for managed-node3 10202 1727204042.78222: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204042.78225: Calling groups_plugins_play to load vars for managed-node3 10202 1727204042.78397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204042.78614: done with get_vars() 10202 1727204042.78626: done getting variables 10202 1727204042.78688: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.035) 0:00:04.462 ***** 10202 1727204042.78726: entering _queue_task() for managed-node3/package 10202 1727204042.79033: worker is 1 (out of 1 available) 10202 1727204042.79053: exiting _queue_task() for managed-node3/package 10202 1727204042.79067: done queuing things up, now waiting for results queue to drain 10202 1727204042.79069: waiting for pending results... 
10202 1727204042.79291: running TaskExecutor() for managed-node3/TASK: Install dnsmasq 10202 1727204042.79473: in run() - task 127b8e07-fff9-0b04-2570-00000000000f 10202 1727204042.79479: variable 'ansible_search_path' from source: unknown 10202 1727204042.79482: variable 'ansible_search_path' from source: unknown 10202 1727204042.79511: calling self._execute() 10202 1727204042.79614: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204042.79630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204042.79645: variable 'omit' from source: magic vars 10202 1727204042.80171: variable 'ansible_distribution_major_version' from source: facts 10202 1727204042.80176: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204042.80178: variable 'omit' from source: magic vars 10202 1727204042.80180: variable 'omit' from source: magic vars 10202 1727204042.80380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204042.82517: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204042.82600: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204042.82648: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204042.82705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204042.82738: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204042.82853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204042.82889: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204042.82921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204042.82975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204042.82997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204042.83170: variable '__network_is_ostree' from source: set_fact 10202 1727204042.83173: variable 'omit' from source: magic vars 10202 1727204042.83176: variable 'omit' from source: magic vars 10202 1727204042.83211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204042.83247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204042.83274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204042.83296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204042.83313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204042.83352: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204042.83361: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204042.83572: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 10202 1727204042.83575: Set connection var ansible_shell_type to sh 10202 1727204042.83577: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204042.83579: Set connection var ansible_connection to ssh 10202 1727204042.83581: Set connection var ansible_shell_executable to /bin/sh 10202 1727204042.83583: Set connection var ansible_pipelining to False 10202 1727204042.83585: Set connection var ansible_timeout to 10 10202 1727204042.83587: variable 'ansible_shell_executable' from source: unknown 10202 1727204042.83589: variable 'ansible_connection' from source: unknown 10202 1727204042.83591: variable 'ansible_module_compression' from source: unknown 10202 1727204042.83593: variable 'ansible_shell_type' from source: unknown 10202 1727204042.83594: variable 'ansible_shell_executable' from source: unknown 10202 1727204042.83596: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204042.83598: variable 'ansible_pipelining' from source: unknown 10202 1727204042.83600: variable 'ansible_timeout' from source: unknown 10202 1727204042.83601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204042.83693: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204042.83710: variable 'omit' from source: magic vars 10202 1727204042.83720: starting attempt loop 10202 1727204042.83730: running the handler 10202 1727204042.83741: variable 'ansible_facts' from source: unknown 10202 1727204042.83749: variable 'ansible_facts' from source: unknown 10202 1727204042.83792: _low_level_execute_command(): starting 10202 1727204042.83804: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 
1727204042.84590: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204042.84629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204042.84649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204042.84678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204042.84793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204042.86632: stdout chunk (state=3): >>>/root <<< 10202 1727204042.86827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204042.86848: stdout chunk (state=3): >>><<< 10202 1727204042.86863: stderr chunk (state=3): >>><<< 10202 1727204042.86892: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204042.86920: _low_level_execute_command(): starting 10202 1727204042.86935: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669 `" && echo ansible-tmp-1727204042.86907-10576-132794806923669="` echo /root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669 `" ) && sleep 0' 10202 1727204042.87651: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204042.87761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204042.87778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204042.87804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204042.87923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204042.90142: stdout chunk (state=3): >>>ansible-tmp-1727204042.86907-10576-132794806923669=/root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669 <<< 10202 1727204042.90253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204042.90332: stderr chunk (state=3): >>><<< 10202 1727204042.90336: stdout chunk (state=3): >>><<< 10202 1727204042.90357: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204042.86907-10576-132794806923669=/root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204042.90473: variable 'ansible_module_compression' from source: unknown 10202 1727204042.90490: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 10202 1727204042.90504: ANSIBALLZ: Acquiring lock 10202 1727204042.90513: ANSIBALLZ: Lock acquired: 140045305564624 10202 1727204042.90522: ANSIBALLZ: Creating module 10202 1727204043.05719: ANSIBALLZ: Writing module into payload 10202 1727204043.05866: ANSIBALLZ: Writing module 10202 1727204043.05888: ANSIBALLZ: Renaming module 10202 1727204043.05894: ANSIBALLZ: Done creating module 10202 1727204043.05913: variable 'ansible_facts' from source: unknown 10202 1727204043.05999: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/AnsiballZ_dnf.py 10202 1727204043.06337: Sending initial data 10202 1727204043.06341: Sent initial data (150 bytes) 10202 1727204043.06887: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204043.06907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204043.06921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204043.07026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204043.08850: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204043.08913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204043.08977: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpsb13zf7u /root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/AnsiballZ_dnf.py <<< 10202 1727204043.08990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/AnsiballZ_dnf.py" <<< 10202 1727204043.09049: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpsb13zf7u" to remote "/root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/AnsiballZ_dnf.py" <<< 10202 1727204043.09060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/AnsiballZ_dnf.py" <<< 10202 1727204043.09927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204043.09999: stderr chunk (state=3): >>><<< 10202 1727204043.10003: stdout chunk (state=3): >>><<< 10202 1727204043.10027: done transferring module to remote 10202 1727204043.10036: _low_level_execute_command(): starting 10202 1727204043.10042: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/ /root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/AnsiballZ_dnf.py && sleep 0' 10202 1727204043.10550: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204043.10554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204043.10557: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204043.10560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204043.10563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204043.10617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204043.10621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204043.10626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204043.10697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204043.12748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204043.12810: stderr chunk (state=3): >>><<< 10202 1727204043.12814: stdout chunk (state=3): >>><<< 10202 1727204043.12832: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204043.12835: _low_level_execute_command(): starting 10202 1727204043.12840: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/AnsiballZ_dnf.py && sleep 0' 10202 1727204043.13335: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204043.13339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204043.13342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204043.13344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204043.13393: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204043.13396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204043.13399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204043.13482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204044.36213: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10202 1727204044.41829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204044.41835: stdout chunk (state=3): >>><<< 10202 1727204044.41838: stderr chunk (state=3): >>><<< 10202 1727204044.41841: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204044.41843: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204044.41852: _low_level_execute_command(): starting 10202 1727204044.41855: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204042.86907-10576-132794806923669/ > /dev/null 2>&1 && sleep 0' 10202 1727204044.43337: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204044.43390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204044.43570: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204044.43850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204044.43945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204044.46144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204044.46241: stderr chunk (state=3): >>><<< 10202 1727204044.46257: stdout chunk (state=3): >>><<< 10202 1727204044.46470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204044.46475: handler run complete 10202 1727204044.46774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204044.47185: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204044.47359: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204044.47403: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204044.47476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204044.47675: variable '__install_status' from source: unknown 10202 1727204044.47705: Evaluated conditional (__install_status is success): True 10202 1727204044.47730: attempt loop complete, returning result 10202 1727204044.47780: _execute() done 10202 1727204044.47789: dumping result to json 10202 1727204044.47805: done dumping result, returning 10202 1727204044.47992: done running TaskExecutor() for managed-node3/TASK: Install dnsmasq [127b8e07-fff9-0b04-2570-00000000000f] 10202 1727204044.47996: sending task result for task 127b8e07-fff9-0b04-2570-00000000000f 10202 1727204044.48081: done sending task result for task 127b8e07-fff9-0b04-2570-00000000000f 10202 1727204044.48085: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 10202 1727204044.48188: no more pending results, returning what we have 10202 1727204044.48192: results queue empty 10202 1727204044.48193: checking for any_errors_fatal 10202 1727204044.48199: done checking for any_errors_fatal 10202 1727204044.48200: checking for max_fail_percentage 10202 1727204044.48202: done checking for max_fail_percentage 10202 1727204044.48203: checking to see if all hosts have 
failed and the running result is not ok 10202 1727204044.48205: done checking to see if all hosts have failed 10202 1727204044.48205: getting the remaining hosts for this loop 10202 1727204044.48207: done getting the remaining hosts for this loop 10202 1727204044.48211: getting the next task for host managed-node3 10202 1727204044.48217: done getting next task for host managed-node3 10202 1727204044.48220: ^ task is: TASK: Install pgrep, sysctl 10202 1727204044.48223: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204044.48226: getting variables 10202 1727204044.48230: in VariableManager get_vars() 10202 1727204044.48498: Calling all_inventory to load vars for managed-node3 10202 1727204044.48502: Calling groups_inventory to load vars for managed-node3 10202 1727204044.48505: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204044.48517: Calling all_plugins_play to load vars for managed-node3 10202 1727204044.48521: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204044.48524: Calling groups_plugins_play to load vars for managed-node3 10202 1727204044.49183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204044.49819: done with get_vars() 10202 1727204044.49837: done getting variables 10202 1727204044.49952: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:54:04 -0400 (0:00:01.714) 0:00:06.176 ***** 10202 1727204044.50152: entering _queue_task() for managed-node3/package 10202 1727204044.50900: worker is 1 (out of 1 available) 10202 1727204044.50913: exiting _queue_task() for managed-node3/package 10202 1727204044.50924: done queuing things up, now waiting for results queue to drain 10202 1727204044.50926: waiting for pending results... 
10202 1727204044.51584: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 10202 1727204044.51835: in run() - task 127b8e07-fff9-0b04-2570-000000000010 10202 1727204044.51840: variable 'ansible_search_path' from source: unknown 10202 1727204044.51843: variable 'ansible_search_path' from source: unknown 10202 1727204044.51846: calling self._execute() 10202 1727204044.51992: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204044.52005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204044.52021: variable 'omit' from source: magic vars 10202 1727204044.52916: variable 'ansible_distribution_major_version' from source: facts 10202 1727204044.52995: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204044.53371: variable 'ansible_os_family' from source: facts 10202 1727204044.53376: Evaluated conditional (ansible_os_family == 'RedHat'): True 10202 1727204044.53572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204044.54173: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204044.54203: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204044.54332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204044.54375: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204044.54571: variable 'ansible_distribution_major_version' from source: facts 10202 1727204044.54625: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 10202 1727204044.54733: when evaluation is False, skipping this task 10202 1727204044.54836: _execute() done 10202 1727204044.54839: dumping result to json 10202 1727204044.54842: done dumping result, 
returning 10202 1727204044.54845: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [127b8e07-fff9-0b04-2570-000000000010] 10202 1727204044.54848: sending task result for task 127b8e07-fff9-0b04-2570-000000000010 10202 1727204044.54931: done sending task result for task 127b8e07-fff9-0b04-2570-000000000010 10202 1727204044.54935: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 10202 1727204044.54992: no more pending results, returning what we have 10202 1727204044.54996: results queue empty 10202 1727204044.54997: checking for any_errors_fatal 10202 1727204044.55008: done checking for any_errors_fatal 10202 1727204044.55009: checking for max_fail_percentage 10202 1727204044.55011: done checking for max_fail_percentage 10202 1727204044.55012: checking to see if all hosts have failed and the running result is not ok 10202 1727204044.55013: done checking to see if all hosts have failed 10202 1727204044.55014: getting the remaining hosts for this loop 10202 1727204044.55016: done getting the remaining hosts for this loop 10202 1727204044.55021: getting the next task for host managed-node3 10202 1727204044.55031: done getting next task for host managed-node3 10202 1727204044.55034: ^ task is: TASK: Install pgrep, sysctl 10202 1727204044.55037: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 10202 1727204044.55041: getting variables 10202 1727204044.55042: in VariableManager get_vars() 10202 1727204044.55089: Calling all_inventory to load vars for managed-node3 10202 1727204044.55092: Calling groups_inventory to load vars for managed-node3 10202 1727204044.55095: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204044.55107: Calling all_plugins_play to load vars for managed-node3 10202 1727204044.55109: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204044.55113: Calling groups_plugins_play to load vars for managed-node3 10202 1727204044.55511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204044.56102: done with get_vars() 10202 1727204044.56116: done getting variables 10202 1727204044.56471: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:54:04 -0400 (0:00:00.063) 0:00:06.240 ***** 10202 1727204044.56507: entering _queue_task() for managed-node3/package 10202 1727204044.57573: worker is 1 (out of 1 available) 10202 1727204044.57586: exiting _queue_task() for managed-node3/package 10202 1727204044.57596: done queuing things up, now waiting for results queue to drain 10202 1727204044.57597: waiting for pending results... 
10202 1727204044.57701: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 10202 1727204044.57948: in run() - task 127b8e07-fff9-0b04-2570-000000000011 10202 1727204044.58121: variable 'ansible_search_path' from source: unknown 10202 1727204044.58125: variable 'ansible_search_path' from source: unknown 10202 1727204044.58131: calling self._execute() 10202 1727204044.58448: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204044.58452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204044.58455: variable 'omit' from source: magic vars 10202 1727204044.59249: variable 'ansible_distribution_major_version' from source: facts 10202 1727204044.59344: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204044.59606: variable 'ansible_os_family' from source: facts 10202 1727204044.59660: Evaluated conditional (ansible_os_family == 'RedHat'): True 10202 1727204044.60086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204044.60914: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204044.61174: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204044.61178: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204044.61182: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204044.61501: variable 'ansible_distribution_major_version' from source: facts 10202 1727204044.61505: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 10202 1727204044.61509: variable 'omit' from source: magic vars 10202 1727204044.61671: variable 'omit' from source: magic vars 10202 1727204044.61949: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204044.66875: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204044.67082: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204044.67135: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204044.67226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204044.67325: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204044.67550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204044.67587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204044.67684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204044.67869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204044.67873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204044.68091: variable '__network_is_ostree' from source: set_fact 10202 1727204044.68199: 
variable 'omit' from source: magic vars 10202 1727204044.68216: variable 'omit' from source: magic vars 10202 1727204044.68254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204044.68341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204044.68404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204044.68550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204044.68635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204044.68639: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204044.68642: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204044.68644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204044.68969: Set connection var ansible_shell_type to sh 10202 1727204044.68984: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204044.68995: Set connection var ansible_connection to ssh 10202 1727204044.69004: Set connection var ansible_shell_executable to /bin/sh 10202 1727204044.69014: Set connection var ansible_pipelining to False 10202 1727204044.69023: Set connection var ansible_timeout to 10 10202 1727204044.69059: variable 'ansible_shell_executable' from source: unknown 10202 1727204044.69070: variable 'ansible_connection' from source: unknown 10202 1727204044.69083: variable 'ansible_module_compression' from source: unknown 10202 1727204044.69090: variable 'ansible_shell_type' from source: unknown 10202 1727204044.69097: variable 'ansible_shell_executable' from source: unknown 10202 1727204044.69105: variable 'ansible_host' from source: host vars for 'managed-node3' 
10202 1727204044.69113: variable 'ansible_pipelining' from source: unknown 10202 1727204044.69121: variable 'ansible_timeout' from source: unknown 10202 1727204044.69133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204044.69409: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204044.69430: variable 'omit' from source: magic vars 10202 1727204044.69441: starting attempt loop 10202 1727204044.69514: running the handler 10202 1727204044.69517: variable 'ansible_facts' from source: unknown 10202 1727204044.69520: variable 'ansible_facts' from source: unknown 10202 1727204044.69735: _low_level_execute_command(): starting 10202 1727204044.69738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204044.70749: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204044.70762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204044.70874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204044.70885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204044.70940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204044.70943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204044.71013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204044.72903: stdout chunk (state=3): >>>/root <<< 10202 1727204044.73119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204044.73130: stderr chunk (state=3): >>><<< 10202 1727204044.73272: stdout chunk (state=3): >>><<< 10202 1727204044.73276: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204044.73283: _low_level_execute_command(): starting 10202 1727204044.73290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546 `" && echo ansible-tmp-1727204044.7326982-10744-66964192267546="` echo /root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546 `" ) && sleep 0' 10202 1727204044.74774: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204044.74986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204044.74991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204044.75105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204044.75137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204044.75216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204044.75352: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10202 1727204044.77556: stdout chunk (state=3): >>>ansible-tmp-1727204044.7326982-10744-66964192267546=/root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546 <<< 10202 1727204044.77785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204044.77909: stderr chunk (state=3): >>><<< 10202 1727204044.77918: stdout chunk (state=3): >>><<< 10202 1727204044.77981: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204044.7326982-10744-66964192267546=/root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204044.78218: variable 'ansible_module_compression' from source: unknown 10202 1727204044.78221: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10202 1727204044.78311: variable 'ansible_facts' from source: unknown 10202 1727204044.78568: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/AnsiballZ_dnf.py 10202 1727204044.78950: Sending initial data 10202 1727204044.78961: Sent initial data (151 bytes) 10202 1727204044.80489: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204044.80609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204044.80811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204044.81186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204044.82981: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204044.83102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10202 1727204044.83128: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/AnsiballZ_dnf.py" <<< 10202 1727204044.83132: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpjouttfdd /root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/AnsiballZ_dnf.py <<< 10202 1727204044.83329: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpjouttfdd" to remote "/root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/AnsiballZ_dnf.py" <<< 10202 1727204044.85723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204044.85935: stderr chunk (state=3): >>><<< 10202 1727204044.85946: stdout chunk (state=3): >>><<< 10202 1727204044.86123: done transferring module to remote 10202 1727204044.86129: _low_level_execute_command(): starting 10202 1727204044.86132: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/ 
/root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/AnsiballZ_dnf.py && sleep 0' 10202 1727204044.87558: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204044.87776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204044.87825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204044.87892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204044.90065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204044.90081: stdout chunk (state=3): >>><<< 10202 1727204044.90297: stderr chunk (state=3): >>><<< 10202 1727204044.90302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204044.90304: _low_level_execute_command(): starting 10202 1727204044.90308: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/AnsiballZ_dnf.py && sleep 0' 10202 1727204044.91496: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204044.91764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204044.91887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204044.92012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204046.12892: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10202 1727204046.18826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204046.18836: stdout chunk (state=3): >>><<< 10202 1727204046.18839: stderr chunk (state=3): >>><<< 10202 1727204046.18863: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204046.18934: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204046.18952: _low_level_execute_command(): starting 10202 1727204046.18968: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204044.7326982-10744-66964192267546/ > /dev/null 2>&1 && sleep 0' 10202 1727204046.19689: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204046.19752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204046.19811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204046.19829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204046.19891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204046.22173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204046.22177: stdout chunk (state=3): >>><<< 10202 1727204046.22180: stderr chunk (state=3): >>><<< 10202 1727204046.22182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204046.22185: handler run complete 10202 1727204046.22187: attempt loop complete, returning result 10202 1727204046.22189: _execute() done 10202 1727204046.22191: dumping result to json 10202 1727204046.22193: done dumping result, returning 10202 1727204046.22195: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [127b8e07-fff9-0b04-2570-000000000011] 10202 1727204046.22197: sending task result for task 127b8e07-fff9-0b04-2570-000000000011 ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 10202 1727204046.22358: no more pending results, returning what we have 10202 1727204046.22362: results queue empty 10202 1727204046.22363: checking for any_errors_fatal 10202 1727204046.22478: done checking for any_errors_fatal 10202 1727204046.22479: checking for max_fail_percentage 10202 1727204046.22481: done checking for max_fail_percentage 10202 1727204046.22482: checking to see if all hosts have failed and the running result is not ok 10202 1727204046.22483: done checking to see if all hosts have failed 10202 1727204046.22484: getting the remaining hosts for this loop 10202 1727204046.22485: done getting the remaining hosts for this loop 10202 1727204046.22489: getting the next task for host managed-node3 10202 1727204046.22495: done getting next task for host managed-node3 10202 1727204046.22498: ^ task is: TASK: Create test interfaces 10202 1727204046.22501: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204046.22504: getting variables 10202 1727204046.22505: in VariableManager get_vars() 10202 1727204046.22541: Calling all_inventory to load vars for managed-node3 10202 1727204046.22544: Calling groups_inventory to load vars for managed-node3 10202 1727204046.22546: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204046.22611: done sending task result for task 127b8e07-fff9-0b04-2570-000000000011 10202 1727204046.22615: WORKER PROCESS EXITING 10202 1727204046.22625: Calling all_plugins_play to load vars for managed-node3 10202 1727204046.22629: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204046.22633: Calling groups_plugins_play to load vars for managed-node3 10202 1727204046.22860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204046.23081: done with get_vars() 10202 1727204046.23094: done getting variables 10202 1727204046.23191: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:54:06 -0400 (0:00:01.667) 0:00:07.907 ***** 10202 1727204046.23226: entering _queue_task() for managed-node3/shell 10202 1727204046.23228: Creating lock for shell 10202 1727204046.23581: worker is 1 (out of 1 available) 10202 1727204046.23596: exiting _queue_task() for 
managed-node3/shell 10202 1727204046.23611: done queuing things up, now waiting for results queue to drain 10202 1727204046.23612: waiting for pending results... 10202 1727204046.24086: running TaskExecutor() for managed-node3/TASK: Create test interfaces 10202 1727204046.24092: in run() - task 127b8e07-fff9-0b04-2570-000000000012 10202 1727204046.24096: variable 'ansible_search_path' from source: unknown 10202 1727204046.24098: variable 'ansible_search_path' from source: unknown 10202 1727204046.24101: calling self._execute() 10202 1727204046.24187: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204046.24201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204046.24223: variable 'omit' from source: magic vars 10202 1727204046.24648: variable 'ansible_distribution_major_version' from source: facts 10202 1727204046.24674: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204046.24687: variable 'omit' from source: magic vars 10202 1727204046.24744: variable 'omit' from source: magic vars 10202 1727204046.25176: variable 'dhcp_interface1' from source: play vars 10202 1727204046.25187: variable 'dhcp_interface2' from source: play vars 10202 1727204046.25232: variable 'omit' from source: magic vars 10202 1727204046.25284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204046.25332: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204046.25357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204046.25383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204046.25400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
10202 1727204046.25525: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204046.25529: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204046.25531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204046.25575: Set connection var ansible_shell_type to sh 10202 1727204046.25588: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204046.25598: Set connection var ansible_connection to ssh 10202 1727204046.25610: Set connection var ansible_shell_executable to /bin/sh 10202 1727204046.25622: Set connection var ansible_pipelining to False 10202 1727204046.25637: Set connection var ansible_timeout to 10 10202 1727204046.25672: variable 'ansible_shell_executable' from source: unknown 10202 1727204046.25681: variable 'ansible_connection' from source: unknown 10202 1727204046.25689: variable 'ansible_module_compression' from source: unknown 10202 1727204046.25696: variable 'ansible_shell_type' from source: unknown 10202 1727204046.25704: variable 'ansible_shell_executable' from source: unknown 10202 1727204046.25711: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204046.25719: variable 'ansible_pipelining' from source: unknown 10202 1727204046.25726: variable 'ansible_timeout' from source: unknown 10202 1727204046.25738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204046.25904: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204046.25963: variable 'omit' from source: magic vars 10202 1727204046.25966: starting attempt loop 10202 1727204046.25970: running the handler 10202 1727204046.25978: Loading ActionModule 'command' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204046.25983: _low_level_execute_command(): starting 10202 1727204046.25997: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204046.26853: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204046.26900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204046.26917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204046.26950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204046.27075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204046.28906: stdout chunk (state=3): >>>/root <<< 10202 1727204046.29114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
10202 1727204046.29118: stdout chunk (state=3): >>><<<
10202 1727204046.29121: stderr chunk (state=3): >>><<<
10202 1727204046.29264: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204046.29272: _low_level_execute_command(): starting
10202 1727204046.29275: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317 `" && echo ansible-tmp-1727204046.2915313-10796-87766365300317="` echo /root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317 `" ) && sleep 0'
10202 1727204046.29847: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204046.29897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
10202 1727204046.29933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204046.30000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204046.32197: stdout chunk (state=3): >>>ansible-tmp-1727204046.2915313-10796-87766365300317=/root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317 <<<
10202 1727204046.32396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204046.32418: stderr chunk (state=3): >>><<<
10202 1727204046.32435: stdout chunk (state=3): >>><<<
10202 1727204046.32467: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204046.2915313-10796-87766365300317=/root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204046.32521: variable 'ansible_module_compression' from source: unknown
10202 1727204046.32603: ANSIBALLZ: Using generic lock for ansible.legacy.command
10202 1727204046.32612: ANSIBALLZ: Acquiring lock
10202 1727204046.32672: ANSIBALLZ: Lock acquired: 140045305564624
10202 1727204046.32676: ANSIBALLZ: Creating module
10202 1727204046.57806: ANSIBALLZ: Writing module into payload
10202 1727204046.58123: ANSIBALLZ: Writing module
10202 1727204046.58148: ANSIBALLZ: Renaming module
10202 1727204046.58154: ANSIBALLZ: Done creating module
10202 1727204046.58383: variable 'ansible_facts' from source: unknown
10202 1727204046.58525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/AnsiballZ_command.py
10202 1727204046.59162: Sending initial data
10202 1727204046.59169: Sent initial data (155 bytes)
10202 1727204046.60696: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
10202 1727204046.60701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
10202 1727204046.60703: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
10202 1727204046.60706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204046.60772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204046.60849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204046.60929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204046.62740: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
10202 1727204046.62962: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
10202 1727204046.63040: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpurrc3946 /root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/AnsiballZ_command.py <<<
10202 1727204046.63045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/AnsiballZ_command.py" <<<
10202 1727204046.63099: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpurrc3946" to remote "/root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/AnsiballZ_command.py" <<<
10202 1727204046.64671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204046.64889: stderr chunk (state=3): >>><<<
10202 1727204046.64893: stdout chunk (state=3): >>><<<
10202 1727204046.64919: done transferring module to remote
10202 1727204046.64932: _low_level_execute_command(): starting
10202 1727204046.64936: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/ /root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/AnsiballZ_command.py && sleep 0'
10202 1727204046.66994: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
10202 1727204046.67563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204046.67819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204046.67856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204046.70023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204046.70029: stdout chunk (state=3): >>><<<
10202 1727204046.70032: stderr chunk (state=3): >>><<<
10202 1727204046.70092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204046.70096: _low_level_execute_command(): starting
10202 1727204046.70099: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/AnsiballZ_command.py && sleep 0'
10202 1727204046.71843: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
10202 1727204046.72125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204046.72491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204048.19954: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 670 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 670 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! <<<
10202 1727204048.19962: stdout chunk (state=3): >>>firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:06.897129", "end": "2024-09-24 14:54:08.197819", "delta": "0:00:01.300690", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
10202 1727204048.21837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<<
10202 1727204048.21842: stdout chunk (state=3): >>><<<
10202 1727204048.21845: stderr chunk (state=3): >>><<<
10202 1727204048.21871: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 670 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 670 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:06.897129", "end": "2024-09-24 14:54:08.197819", "delta": "0:00:01.300690", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed.
10202 1727204048.22120: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
10202 1727204048.22123: _low_level_execute_command(): starting
10202 1727204048.22126: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204046.2915313-10796-87766365300317/ > /dev/null 2>&1 && sleep 0'
10202 1727204048.23107: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
10202 1727204048.23160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
10202 1727204048.23283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204048.23303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204048.23322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204048.23341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204048.23455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204048.25874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204048.25878: stdout chunk (state=3): >>><<< 10202 1727204048.25881: stderr chunk (state=3): >>><<< 10202 1727204048.25883: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204048.25886: handler run complete 10202 1727204048.25892: Evaluated conditional (False): False 10202 1727204048.25895: attempt loop complete, returning result 10202 1727204048.25897: _execute() done 10202 1727204048.25899: dumping result to json 10202 1727204048.25906: done dumping result, returning 10202 1727204048.25909: done running TaskExecutor() for managed-node3/TASK: Create test interfaces [127b8e07-fff9-0b04-2570-000000000012] 10202 1727204048.25911: sending task result for task 127b8e07-fff9-0b04-2570-000000000012 ok: [managed-node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.300690", "end": "2024-09-24 14:54:08.197819", "rc": 0, "start": "2024-09-24 14:54:06.897129" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 670 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 670 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + grep -q 'inet [1-9]' + ip addr show testbr + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 10202 1727204048.26442: no more pending results, returning what we have 10202 1727204048.26446: results queue empty 10202 1727204048.26447: checking for any_errors_fatal 10202 1727204048.26460: done checking for any_errors_fatal 10202 1727204048.26464: checking for max_fail_percentage 10202 1727204048.26468: done checking for max_fail_percentage 10202 1727204048.26469: checking to see if all hosts have failed and 
the running result is not ok 10202 1727204048.26471: done checking to see if all hosts have failed 10202 1727204048.26475: getting the remaining hosts for this loop 10202 1727204048.26477: done getting the remaining hosts for this loop 10202 1727204048.26640: getting the next task for host managed-node3 10202 1727204048.26655: done getting next task for host managed-node3 10202 1727204048.26659: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10202 1727204048.26662: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204048.26667: getting variables 10202 1727204048.26669: in VariableManager get_vars() 10202 1727204048.26976: Calling all_inventory to load vars for managed-node3 10202 1727204048.26979: Calling groups_inventory to load vars for managed-node3 10202 1727204048.26981: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204048.26999: Calling all_plugins_play to load vars for managed-node3 10202 1727204048.27003: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204048.27007: Calling groups_plugins_play to load vars for managed-node3 10202 1727204048.27276: done sending task result for task 127b8e07-fff9-0b04-2570-000000000012 10202 1727204048.27280: WORKER PROCESS EXITING 10202 1727204048.27314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204048.28016: done with get_vars() 10202 1727204048.28172: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:08 -0400 (0:00:02.052) 0:00:09.960 ***** 10202 1727204048.28483: entering _queue_task() for managed-node3/include_tasks 10202 1727204048.29147: worker is 1 (out of 1 available) 10202 1727204048.29163: exiting _queue_task() for managed-node3/include_tasks 10202 1727204048.29493: done queuing things up, now waiting for results queue to drain 10202 1727204048.29495: waiting for pending results... 
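The task banner above ends with two timer stamps: the just-finished task's duration in parentheses (`0:00:02.052`) and the cumulative playbook time (`0:00:09.960`). A small sketch of parsing those stamps and checking the arithmetic; the `parse_stamp` helper is ours, not an Ansible API:

```python
from datetime import timedelta

def parse_stamp(text: str) -> timedelta:
    """Parse a timer stamp like '0:00:09.960' (H:MM:SS.mmm) into a timedelta."""
    hours, minutes, seconds = text.split(":")
    return timedelta(hours=int(hours), minutes=int(minutes), seconds=float(seconds))

# Stamps copied verbatim from the task banner above.
task_duration = parse_stamp("0:00:02.052")
cumulative = parse_stamp("0:00:09.960")

# The cumulative clock before this task is the difference of the two.
previous = cumulative - task_duration
print(previous)  # 0:00:07.908000
```

`timedelta` stores whole microseconds, so millisecond stamps like these subtract exactly with no float drift.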
10202 1727204048.29822: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 10202 1727204048.30147: in run() - task 127b8e07-fff9-0b04-2570-000000000016 10202 1727204048.30160: variable 'ansible_search_path' from source: unknown 10202 1727204048.30164: variable 'ansible_search_path' from source: unknown 10202 1727204048.30206: calling self._execute() 10202 1727204048.30299: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.30304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.30315: variable 'omit' from source: magic vars 10202 1727204048.30700: variable 'ansible_distribution_major_version' from source: facts 10202 1727204048.30713: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204048.30720: _execute() done 10202 1727204048.30723: dumping result to json 10202 1727204048.30730: done dumping result, returning 10202 1727204048.30734: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-0b04-2570-000000000016] 10202 1727204048.30743: sending task result for task 127b8e07-fff9-0b04-2570-000000000016 10202 1727204048.30850: done sending task result for task 127b8e07-fff9-0b04-2570-000000000016 10202 1727204048.30853: WORKER PROCESS EXITING 10202 1727204048.30897: no more pending results, returning what we have 10202 1727204048.30902: in VariableManager get_vars() 10202 1727204048.30952: Calling all_inventory to load vars for managed-node3 10202 1727204048.30955: Calling groups_inventory to load vars for managed-node3 10202 1727204048.30957: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204048.30977: Calling all_plugins_play to load vars for managed-node3 10202 1727204048.30981: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204048.30984: Calling groups_plugins_play to load vars for managed-node3 10202 
1727204048.31239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204048.31464: done with get_vars() 10202 1727204048.31475: variable 'ansible_search_path' from source: unknown 10202 1727204048.31477: variable 'ansible_search_path' from source: unknown 10202 1727204048.31531: we have included files to process 10202 1727204048.31533: generating all_blocks data 10202 1727204048.31534: done generating all_blocks data 10202 1727204048.31535: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204048.31536: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204048.31539: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204048.31818: done processing included file 10202 1727204048.31820: iterating over new_blocks loaded from include file 10202 1727204048.31822: in VariableManager get_vars() 10202 1727204048.31852: done with get_vars() 10202 1727204048.31854: filtering new block on tags 10202 1727204048.31875: done filtering new block on tags 10202 1727204048.31877: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 10202 1727204048.31883: extending task lists for all hosts with included blocks 10202 1727204048.32005: done extending task lists 10202 1727204048.32006: done processing included files 10202 1727204048.32007: results queue empty 10202 1727204048.32008: checking for any_errors_fatal 10202 1727204048.32015: done checking for any_errors_fatal 10202 1727204048.32016: checking for max_fail_percentage 10202 1727204048.32017: done checking for 
max_fail_percentage 10202 1727204048.32018: checking to see if all hosts have failed and the running result is not ok 10202 1727204048.32019: done checking to see if all hosts have failed 10202 1727204048.32020: getting the remaining hosts for this loop 10202 1727204048.32021: done getting the remaining hosts for this loop 10202 1727204048.32024: getting the next task for host managed-node3 10202 1727204048.32035: done getting next task for host managed-node3 10202 1727204048.32038: ^ task is: TASK: Get stat for interface {{ interface }} 10202 1727204048.32041: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204048.32043: getting variables 10202 1727204048.32044: in VariableManager get_vars() 10202 1727204048.32064: Calling all_inventory to load vars for managed-node3 10202 1727204048.32068: Calling groups_inventory to load vars for managed-node3 10202 1727204048.32070: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204048.32076: Calling all_plugins_play to load vars for managed-node3 10202 1727204048.32079: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204048.32082: Calling groups_plugins_play to load vars for managed-node3 10202 1727204048.32245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204048.32482: done with get_vars() 10202 1727204048.32496: done getting variables 10202 1727204048.32708: variable 'interface' from source: task vars 10202 1727204048.32714: variable 'dhcp_interface1' from source: play vars 10202 1727204048.32804: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.044) 0:00:10.004 ***** 10202 1727204048.32897: entering _queue_task() for managed-node3/stat 10202 1727204048.33811: worker is 1 (out of 1 available) 10202 1727204048.33823: exiting _queue_task() for managed-node3/stat 10202 1727204048.33836: done queuing things up, now waiting for results queue to drain 10202 1727204048.33838: waiting for pending results... 
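Every debug line in this log follows the same shape: the controller PID, an epoch timestamp in seconds, a colon, then the message. The epoch stamps line up with the human-readable banners, e.g. `1727204048.32897` falls within `Tuesday 24 September 2024 14:54:08 -0400`, which is 18:54:08 UTC. A sketch of parsing one such line; the regex and names are ours, not Ansible's:

```python
import re
from datetime import datetime, timezone

# Shape of each debug line in this log: "<pid> <epoch-seconds>: <message>".
LINE_RE = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

line = "10202 1727204048.32897: entering _queue_task() for managed-node3/stat"
m = LINE_RE.match(line)
assert m is not None
pid = int(m.group("pid"))
ts = float(m.group("ts"))

# Convert the epoch stamp; it matches the task banner's local time
# ("14:54:08 -0400" == 18:54:08 UTC).
when = datetime.fromtimestamp(ts, tz=timezone.utc)
print(pid, when.strftime("%Y-%m-%d %H:%M:%S"))  # 10202 2024-09-24 18:54:08
```

Diffing two such `ts` values is a quick way to measure how long any internal step (queueing, module transfer, `_low_level_execute_command`) took.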
10202 1727204048.34405: running TaskExecutor() for managed-node3/TASK: Get stat for interface test1 10202 1727204048.34454: in run() - task 127b8e07-fff9-0b04-2570-000000000152 10202 1727204048.34557: variable 'ansible_search_path' from source: unknown 10202 1727204048.34664: variable 'ansible_search_path' from source: unknown 10202 1727204048.34673: calling self._execute() 10202 1727204048.34814: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.34822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.34833: variable 'omit' from source: magic vars 10202 1727204048.35836: variable 'ansible_distribution_major_version' from source: facts 10202 1727204048.35846: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204048.35854: variable 'omit' from source: magic vars 10202 1727204048.35919: variable 'omit' from source: magic vars 10202 1727204048.36334: variable 'interface' from source: task vars 10202 1727204048.36340: variable 'dhcp_interface1' from source: play vars 10202 1727204048.36412: variable 'dhcp_interface1' from source: play vars 10202 1727204048.36432: variable 'omit' from source: magic vars 10202 1727204048.36580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204048.36617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204048.36637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204048.36655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204048.36670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204048.36703: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 10202 1727204048.36706: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.36709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.37134: Set connection var ansible_shell_type to sh 10202 1727204048.37140: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204048.37147: Set connection var ansible_connection to ssh 10202 1727204048.37153: Set connection var ansible_shell_executable to /bin/sh 10202 1727204048.37160: Set connection var ansible_pipelining to False 10202 1727204048.37170: Set connection var ansible_timeout to 10 10202 1727204048.37216: variable 'ansible_shell_executable' from source: unknown 10202 1727204048.37219: variable 'ansible_connection' from source: unknown 10202 1727204048.37222: variable 'ansible_module_compression' from source: unknown 10202 1727204048.37224: variable 'ansible_shell_type' from source: unknown 10202 1727204048.37227: variable 'ansible_shell_executable' from source: unknown 10202 1727204048.37231: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.37233: variable 'ansible_pipelining' from source: unknown 10202 1727204048.37237: variable 'ansible_timeout' from source: unknown 10202 1727204048.37239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.37592: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204048.37673: variable 'omit' from source: magic vars 10202 1727204048.37676: starting attempt loop 10202 1727204048.37679: running the handler 10202 1727204048.37681: _low_level_execute_command(): starting 10202 1727204048.37683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 
1727204048.38484: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204048.38562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204048.38590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204048.38690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204048.38816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204048.40690: stdout chunk (state=3): >>>/root <<< 10202 1727204048.40789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204048.41172: stderr chunk (state=3): >>><<< 10202 1727204048.41176: stdout chunk (state=3): >>><<< 10202 1727204048.41179: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204048.41182: _low_level_execute_command(): starting 10202 1727204048.41185: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630 `" && echo ansible-tmp-1727204048.4112468-10860-247662130929630="` echo /root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630 `" ) && sleep 0' 10202 1727204048.42635: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204048.42654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204048.42679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204048.42782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204048.42983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204048.43170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204048.45359: stdout chunk (state=3): >>>ansible-tmp-1727204048.4112468-10860-247662130929630=/root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630 <<< 10202 1727204048.45560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204048.45577: stdout chunk (state=3): >>><<< 10202 1727204048.45591: stderr chunk (state=3): >>><<< 10202 1727204048.45617: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204048.4112468-10860-247662130929630=/root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204048.45786: variable 'ansible_module_compression' from source: unknown 10202 1727204048.45857: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10202 1727204048.45971: variable 'ansible_facts' from source: unknown 10202 1727204048.46004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/AnsiballZ_stat.py 10202 1727204048.46189: Sending initial data 10202 1727204048.46203: Sent initial data (153 bytes) 10202 1727204048.46878: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204048.46929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204048.46943: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204048.46955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204048.47040: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204048.47290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204048.47420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204048.49364: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204048.49421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204048.49503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpq3uhe3g6 /root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/AnsiballZ_stat.py <<< 10202 1727204048.49507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/AnsiballZ_stat.py" <<< 10202 1727204048.49601: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpq3uhe3g6" to remote "/root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/AnsiballZ_stat.py" <<< 10202 1727204048.51374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204048.51378: stderr chunk (state=3): >>><<< 10202 1727204048.51381: stdout chunk (state=3): >>><<< 10202 1727204048.51383: done transferring module to remote 10202 1727204048.51385: _low_level_execute_command(): starting 10202 1727204048.51387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/ /root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/AnsiballZ_stat.py && sleep 0' 10202 1727204048.52390: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204048.52691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204048.52705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204048.52864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204048.55001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204048.55294: stderr chunk (state=3): >>><<< 10202 1727204048.55298: stdout chunk (state=3): >>><<< 10202 1727204048.55314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204048.55331: _low_level_execute_command(): starting 10202 1727204048.55342: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/AnsiballZ_stat.py && sleep 0' 10202 1727204048.56753: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204048.56763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204048.56784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204048.56800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204048.56813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204048.56896: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204048.56918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204048.56933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204048.56953: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 10202 1727204048.57139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204048.75139: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35182, "dev": 23, "nlink": 1, "atime": 1727204046.9048102, "mtime": 1727204046.9048102, "ctime": 1727204046.9048102, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10202 1727204048.76814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204048.76852: stderr chunk (state=3): >>><<< 10202 1727204048.76856: stdout chunk (state=3): >>><<< 10202 1727204048.77039: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35182, "dev": 23, "nlink": 1, "atime": 1727204046.9048102, "mtime": 1727204046.9048102, "ctime": 1727204046.9048102, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204048.77043: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204048.77046: _low_level_execute_command(): starting 10202 1727204048.77048: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204048.4112468-10860-247662130929630/ > /dev/null 2>&1 && sleep 0' 10202 1727204048.77670: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204048.77747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204048.77831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204048.77938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204048.78095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204048.80242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204048.80247: stdout chunk (state=3): >>><<< 10202 1727204048.80250: stderr chunk (state=3): >>><<< 10202 1727204048.80370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204048.80376: handler run complete 10202 1727204048.80378: attempt loop complete, returning result 10202 1727204048.80380: _execute() done 10202 1727204048.80383: dumping result to json 10202 1727204048.80386: done dumping result, returning 10202 1727204048.80389: done running TaskExecutor() for managed-node3/TASK: Get stat for interface test1 [127b8e07-fff9-0b04-2570-000000000152] 10202 1727204048.80391: sending task result for task 127b8e07-fff9-0b04-2570-000000000152 10202 1727204048.80490: done sending task result for task 127b8e07-fff9-0b04-2570-000000000152 10202 1727204048.80493: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204046.9048102, "block_size": 4096, "blocks": 0, "ctime": 1727204046.9048102, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35182, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204046.9048102, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10202 1727204048.80763: no more pending results, returning what we have 10202 1727204048.80769: results queue empty 10202 1727204048.80770: checking for any_errors_fatal 10202 1727204048.80772: done checking for any_errors_fatal 10202 1727204048.80773: checking for 
max_fail_percentage 10202 1727204048.80774: done checking for max_fail_percentage 10202 1727204048.80775: checking to see if all hosts have failed and the running result is not ok 10202 1727204048.80776: done checking to see if all hosts have failed 10202 1727204048.80777: getting the remaining hosts for this loop 10202 1727204048.80779: done getting the remaining hosts for this loop 10202 1727204048.80783: getting the next task for host managed-node3 10202 1727204048.80790: done getting next task for host managed-node3 10202 1727204048.80792: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10202 1727204048.80795: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204048.80799: getting variables 10202 1727204048.80800: in VariableManager get_vars() 10202 1727204048.80831: Calling all_inventory to load vars for managed-node3 10202 1727204048.80834: Calling groups_inventory to load vars for managed-node3 10202 1727204048.80837: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204048.80848: Calling all_plugins_play to load vars for managed-node3 10202 1727204048.80851: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204048.80854: Calling groups_plugins_play to load vars for managed-node3 10202 1727204048.81026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204048.81240: done with get_vars() 10202 1727204048.81253: done getting variables 10202 1727204048.81362: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 10202 1727204048.81490: variable 'interface' from source: task vars 10202 1727204048.81495: variable 'dhcp_interface1' from source: play vars 10202 1727204048.81558: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.486) 0:00:10.491 ***** 10202 1727204048.81594: entering _queue_task() for managed-node3/assert 10202 1727204048.81596: Creating lock for assert 10202 1727204048.81953: worker is 1 (out of 1 available) 10202 1727204048.82171: exiting _queue_task() for managed-node3/assert 10202 1727204048.82183: done queuing things up, now waiting for results queue to drain 10202 
1727204048.82185: waiting for pending results... 10202 1727204048.82618: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' 10202 1727204048.82630: in run() - task 127b8e07-fff9-0b04-2570-000000000017 10202 1727204048.82634: variable 'ansible_search_path' from source: unknown 10202 1727204048.82637: variable 'ansible_search_path' from source: unknown 10202 1727204048.82639: calling self._execute() 10202 1727204048.82641: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.82644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.82646: variable 'omit' from source: magic vars 10202 1727204048.82973: variable 'ansible_distribution_major_version' from source: facts 10202 1727204048.82995: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204048.83010: variable 'omit' from source: magic vars 10202 1727204048.83173: variable 'omit' from source: magic vars 10202 1727204048.83186: variable 'interface' from source: task vars 10202 1727204048.83195: variable 'dhcp_interface1' from source: play vars 10202 1727204048.83264: variable 'dhcp_interface1' from source: play vars 10202 1727204048.83295: variable 'omit' from source: magic vars 10202 1727204048.83340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204048.83386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204048.83418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204048.83445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204048.83462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204048.83506: 
variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204048.83570: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.83573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.83641: Set connection var ansible_shell_type to sh 10202 1727204048.83654: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204048.83664: Set connection var ansible_connection to ssh 10202 1727204048.83677: Set connection var ansible_shell_executable to /bin/sh 10202 1727204048.83688: Set connection var ansible_pipelining to False 10202 1727204048.83698: Set connection var ansible_timeout to 10 10202 1727204048.83735: variable 'ansible_shell_executable' from source: unknown 10202 1727204048.83743: variable 'ansible_connection' from source: unknown 10202 1727204048.83750: variable 'ansible_module_compression' from source: unknown 10202 1727204048.83755: variable 'ansible_shell_type' from source: unknown 10202 1727204048.83761: variable 'ansible_shell_executable' from source: unknown 10202 1727204048.83826: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.83830: variable 'ansible_pipelining' from source: unknown 10202 1727204048.83833: variable 'ansible_timeout' from source: unknown 10202 1727204048.83836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.84086: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204048.84415: variable 'omit' from source: magic vars 10202 1727204048.84418: starting attempt loop 10202 1727204048.84420: running the handler 10202 1727204048.84773: variable 'interface_stat' from source: set_fact 10202 
1727204048.84777: Evaluated conditional (interface_stat.stat.exists): True 10202 1727204048.84779: handler run complete 10202 1727204048.84781: attempt loop complete, returning result 10202 1727204048.84783: _execute() done 10202 1727204048.84785: dumping result to json 10202 1727204048.84787: done dumping result, returning 10202 1727204048.84796: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' [127b8e07-fff9-0b04-2570-000000000017] 10202 1727204048.84806: sending task result for task 127b8e07-fff9-0b04-2570-000000000017 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204048.84971: no more pending results, returning what we have 10202 1727204048.84975: results queue empty 10202 1727204048.84976: checking for any_errors_fatal 10202 1727204048.84986: done checking for any_errors_fatal 10202 1727204048.84987: checking for max_fail_percentage 10202 1727204048.84988: done checking for max_fail_percentage 10202 1727204048.84989: checking to see if all hosts have failed and the running result is not ok 10202 1727204048.84991: done checking to see if all hosts have failed 10202 1727204048.84992: getting the remaining hosts for this loop 10202 1727204048.84994: done getting the remaining hosts for this loop 10202 1727204048.84998: getting the next task for host managed-node3 10202 1727204048.85008: done getting next task for host managed-node3 10202 1727204048.85011: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10202 1727204048.85014: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204048.85018: getting variables 10202 1727204048.85020: in VariableManager get_vars() 10202 1727204048.85069: Calling all_inventory to load vars for managed-node3 10202 1727204048.85072: Calling groups_inventory to load vars for managed-node3 10202 1727204048.85075: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204048.85089: Calling all_plugins_play to load vars for managed-node3 10202 1727204048.85092: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204048.85095: Calling groups_plugins_play to load vars for managed-node3 10202 1727204048.86001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204048.86222: done with get_vars() 10202 1727204048.86239: done getting variables 10202 1727204048.86479: done sending task result for task 127b8e07-fff9-0b04-2570-000000000017 10202 1727204048.86483: WORKER PROCESS EXITING TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.049) 0:00:10.541 ***** 10202 1727204048.86569: entering _queue_task() for managed-node3/include_tasks 10202 1727204048.87354: worker is 1 (out of 1 available) 10202 1727204048.87371: exiting _queue_task() for managed-node3/include_tasks 10202 1727204048.87384: done queuing things up, now waiting for results queue to drain 10202 1727204048.87385: waiting for pending results... 
10202 1727204048.87672: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 10202 1727204048.88043: in run() - task 127b8e07-fff9-0b04-2570-00000000001b 10202 1727204048.88057: variable 'ansible_search_path' from source: unknown 10202 1727204048.88061: variable 'ansible_search_path' from source: unknown 10202 1727204048.88104: calling self._execute() 10202 1727204048.88388: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.88403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.88411: variable 'omit' from source: magic vars 10202 1727204048.89364: variable 'ansible_distribution_major_version' from source: facts 10202 1727204048.89379: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204048.89387: _execute() done 10202 1727204048.89506: dumping result to json 10202 1727204048.89510: done dumping result, returning 10202 1727204048.89587: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-0b04-2570-00000000001b] 10202 1727204048.89594: sending task result for task 127b8e07-fff9-0b04-2570-00000000001b 10202 1727204048.89868: done sending task result for task 127b8e07-fff9-0b04-2570-00000000001b 10202 1727204048.89905: no more pending results, returning what we have 10202 1727204048.89910: in VariableManager get_vars() 10202 1727204048.89964: Calling all_inventory to load vars for managed-node3 10202 1727204048.89970: Calling groups_inventory to load vars for managed-node3 10202 1727204048.89972: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204048.89987: Calling all_plugins_play to load vars for managed-node3 10202 1727204048.89990: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204048.89993: Calling groups_plugins_play to load vars for managed-node3 10202 1727204048.90638: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204048.91237: done with get_vars() 10202 1727204048.91250: variable 'ansible_search_path' from source: unknown 10202 1727204048.91252: variable 'ansible_search_path' from source: unknown 10202 1727204048.91286: WORKER PROCESS EXITING 10202 1727204048.91500: we have included files to process 10202 1727204048.91502: generating all_blocks data 10202 1727204048.91504: done generating all_blocks data 10202 1727204048.91509: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204048.91511: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204048.91514: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204048.91816: done processing included file 10202 1727204048.91819: iterating over new_blocks loaded from include file 10202 1727204048.91821: in VariableManager get_vars() 10202 1727204048.91849: done with get_vars() 10202 1727204048.91852: filtering new block on tags 10202 1727204048.91879: done filtering new block on tags 10202 1727204048.91882: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 10202 1727204048.91888: extending task lists for all hosts with included blocks 10202 1727204048.92075: done extending task lists 10202 1727204048.92077: done processing included files 10202 1727204048.92078: results queue empty 10202 1727204048.92079: checking for any_errors_fatal 10202 1727204048.92087: done checking for any_errors_fatal 10202 1727204048.92088: checking for max_fail_percentage 10202 
1727204048.92089: done checking for max_fail_percentage 10202 1727204048.92090: checking to see if all hosts have failed and the running result is not ok 10202 1727204048.92091: done checking to see if all hosts have failed 10202 1727204048.92092: getting the remaining hosts for this loop 10202 1727204048.92093: done getting the remaining hosts for this loop 10202 1727204048.92096: getting the next task for host managed-node3 10202 1727204048.92100: done getting next task for host managed-node3 10202 1727204048.92102: ^ task is: TASK: Get stat for interface {{ interface }} 10202 1727204048.92105: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204048.92107: getting variables
10202 1727204048.92108: in VariableManager get_vars()
10202 1727204048.92123: Calling all_inventory to load vars for managed-node3
10202 1727204048.92126: Calling groups_inventory to load vars for managed-node3
10202 1727204048.92130: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204048.92137: Calling all_plugins_play to load vars for managed-node3
10202 1727204048.92140: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204048.92143: Calling groups_plugins_play to load vars for managed-node3
10202 1727204048.92391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204048.92730: done with get_vars()
10202 1727204048.92742: done getting variables
10202 1727204048.92923: variable 'interface' from source: task vars
10202 1727204048.92927: variable 'dhcp_interface2' from source: play vars
10202 1727204048.92990: variable 'dhcp_interface2' from source: play vars

TASK [Get stat for interface test2] ********************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.064) 0:00:10.605 *****

10202 1727204048.93271: entering _queue_task() for managed-node3/stat
10202 1727204048.93815: worker is 1 (out of 1 available)
10202 1727204048.93832: exiting _queue_task() for managed-node3/stat
10202 1727204048.93846: done queuing things up, now waiting for results queue to drain
10202 1727204048.93847: waiting for pending results...
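The task being queued here comes from `get_interface_stat.yml:3`. Based on the task name and the `stat` module arguments that appear later in this log (`path: /sys/class/net/test2`, `get_attributes`/`get_checksum`/`get_mime` all false), that file plausibly contains something like the following — a hedged reconstruction for orientation, not the actual file contents; the `register` variable name is an assumption:

```yaml
# Hypothetical sketch of get_interface_stat.yml, inferred from the logged
# module_args; the real task file may differ.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # assumed variable name
```

Checking `/sys/class/net/<name>` is a common way to test for an interface's existence, since the kernel exposes one entry per network device there.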
10202 1727204048.94233: running TaskExecutor() for managed-node3/TASK: Get stat for interface test2 10202 1727204048.94450: in run() - task 127b8e07-fff9-0b04-2570-00000000016a 10202 1727204048.94502: variable 'ansible_search_path' from source: unknown 10202 1727204048.94510: variable 'ansible_search_path' from source: unknown 10202 1727204048.94559: calling self._execute() 10202 1727204048.94668: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.94683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.94703: variable 'omit' from source: magic vars 10202 1727204048.95111: variable 'ansible_distribution_major_version' from source: facts 10202 1727204048.95137: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204048.95150: variable 'omit' from source: magic vars 10202 1727204048.95214: variable 'omit' from source: magic vars 10202 1727204048.95321: variable 'interface' from source: task vars 10202 1727204048.95333: variable 'dhcp_interface2' from source: play vars 10202 1727204048.95406: variable 'dhcp_interface2' from source: play vars 10202 1727204048.95435: variable 'omit' from source: magic vars 10202 1727204048.95491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204048.95538: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204048.95568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204048.95594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204048.95612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204048.95651: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 10202 1727204048.95660: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.95672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.95786: Set connection var ansible_shell_type to sh 10202 1727204048.95800: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204048.95811: Set connection var ansible_connection to ssh 10202 1727204048.95820: Set connection var ansible_shell_executable to /bin/sh 10202 1727204048.95833: Set connection var ansible_pipelining to False 10202 1727204048.95843: Set connection var ansible_timeout to 10 10202 1727204048.95876: variable 'ansible_shell_executable' from source: unknown 10202 1727204048.95884: variable 'ansible_connection' from source: unknown 10202 1727204048.95890: variable 'ansible_module_compression' from source: unknown 10202 1727204048.95898: variable 'ansible_shell_type' from source: unknown 10202 1727204048.95906: variable 'ansible_shell_executable' from source: unknown 10202 1727204048.95912: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204048.95920: variable 'ansible_pipelining' from source: unknown 10202 1727204048.95929: variable 'ansible_timeout' from source: unknown 10202 1727204048.95938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204048.96162: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204048.96270: variable 'omit' from source: magic vars 10202 1727204048.96273: starting attempt loop 10202 1727204048.96276: running the handler 10202 1727204048.96278: _low_level_execute_command(): starting 10202 1727204048.96280: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 
1727204048.97061: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204048.97279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204048.97315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204048.97415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204048.99282: stdout chunk (state=3): >>>/root <<< 10202 1727204048.99496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204048.99500: stdout chunk (state=3): >>><<< 10202 1727204048.99502: stderr chunk (state=3): >>><<< 10202 1727204048.99646: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204048.99649: _low_level_execute_command(): starting 10202 1727204048.99653: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095 `" && echo ansible-tmp-1727204048.9953213-10897-3164000145095="` echo /root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095 `" ) && sleep 0' 10202 1727204049.00323: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204049.00368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.00450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.02670: stdout chunk (state=3): >>>ansible-tmp-1727204048.9953213-10897-3164000145095=/root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095 <<< 10202 1727204049.03077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204049.03081: stdout chunk (state=3): >>><<< 10202 1727204049.03083: stderr chunk (state=3): >>><<< 10202 1727204049.03085: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204048.9953213-10897-3164000145095=/root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204049.03088: variable 'ansible_module_compression' from source: unknown 10202 1727204049.03090: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10202 1727204049.03092: variable 'ansible_facts' from source: unknown 10202 1727204049.03192: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/AnsiballZ_stat.py 10202 1727204049.03434: Sending initial data 10202 1727204049.03437: Sent initial data (151 bytes) 10202 1727204049.04152: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204049.04212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204049.04230: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204049.04284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.04348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204049.04387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204049.04433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.04530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.06347: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204049.06430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204049.06503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp0gqvk_dl /root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/AnsiballZ_stat.py <<< 10202 1727204049.06507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/AnsiballZ_stat.py" <<< 10202 1727204049.06610: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp0gqvk_dl" to remote "/root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/AnsiballZ_stat.py" <<< 10202 1727204049.07539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204049.07604: stderr chunk (state=3): >>><<< 10202 1727204049.07614: stdout chunk (state=3): >>><<< 10202 1727204049.07660: done transferring module to remote 10202 1727204049.07683: _low_level_execute_command(): starting 10202 1727204049.07692: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/ /root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/AnsiballZ_stat.py && sleep 0' 10202 1727204049.08562: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.08607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204049.08630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204049.08696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.08792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.10886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204049.10949: stderr chunk (state=3): >>><<< 10202 1727204049.10953: stdout chunk (state=3): >>><<< 10202 1727204049.10970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204049.11007: _low_level_execute_command(): starting 10202 1727204049.11010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/AnsiballZ_stat.py && sleep 0' 10202 1727204049.11695: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204049.11768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204049.11808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.11894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.31199: stdout chunk (state=3): >>> {"changed": false, "stat": 
{"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35588, "dev": 23, "nlink": 1, "atime": 1727204046.9116619, "mtime": 1727204046.9116619, "ctime": 1727204046.9116619, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10202 1727204049.32897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204049.32902: stdout chunk (state=3): >>><<< 10202 1727204049.32905: stderr chunk (state=3): >>><<< 10202 1727204049.32950: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35588, "dev": 23, "nlink": 1, "atime": 1727204046.9116619, "mtime": 1727204046.9116619, "ctime": 1727204046.9116619, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204049.33005: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204049.33011: _low_level_execute_command(): starting 10202 1727204049.33019: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204048.9953213-10897-3164000145095/ > /dev/null 2>&1 && sleep 0' 10202 1727204049.33525: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204049.33532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.33535: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204049.33537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.33600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204049.33603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.33703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.36368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204049.36422: stderr chunk (state=3): >>><<< 10202 1727204049.36426: stdout chunk (state=3): >>><<< 10202 1727204049.36429: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204049.36438: handler run complete
10202 1727204049.36503: attempt loop complete, returning result
10202 1727204049.36511: _execute() done
10202 1727204049.36514: dumping result to json
10202 1727204049.36516: done dumping result, returning
10202 1727204049.36519: done running TaskExecutor() for managed-node3/TASK: Get stat for interface test2 [127b8e07-fff9-0b04-2570-00000000016a]
10202 1727204049.36540: sending task result for task 127b8e07-fff9-0b04-2570-00000000016a
10202 1727204049.36656: done sending task result for task 127b8e07-fff9-0b04-2570-00000000016a
10202 1727204049.36659: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "stat": {
        "atime": 1727204046.9116619,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1727204046.9116619,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 35588,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/test2",
        "lnk_target": "../../devices/virtual/net/test2",
        "mode": "0777",
        "mtime": 1727204046.9116619,
        "nlink": 1,
        "path": "/sys/class/net/test2",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
10202 1727204049.36753: no more pending results, returning what we have
10202 1727204049.36756: results queue empty
10202 1727204049.36757: checking for any_errors_fatal 10202
1727204049.36759: done checking for any_errors_fatal 10202 1727204049.36759: checking for max_fail_percentage 10202 1727204049.36761: done checking for max_fail_percentage 10202 1727204049.36762: checking to see if all hosts have failed and the running result is not ok 10202 1727204049.36763: done checking to see if all hosts have failed 10202 1727204049.36764: getting the remaining hosts for this loop 10202 1727204049.36767: done getting the remaining hosts for this loop 10202 1727204049.36772: getting the next task for host managed-node3 10202 1727204049.36780: done getting next task for host managed-node3 10202 1727204049.36783: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10202 1727204049.36785: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204049.36790: getting variables
10202 1727204049.36792: in VariableManager get_vars()
10202 1727204049.36838: Calling all_inventory to load vars for managed-node3
10202 1727204049.36841: Calling groups_inventory to load vars for managed-node3
10202 1727204049.36843: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204049.36854: Calling all_plugins_play to load vars for managed-node3
10202 1727204049.36856: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204049.36859: Calling groups_plugins_play to load vars for managed-node3
10202 1727204049.37004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204049.37130: done with get_vars()
10202 1727204049.37139: done getting variables
10202 1727204049.37188: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
10202 1727204049.37285: variable 'interface' from source: task vars
10202 1727204049.37288: variable 'dhcp_interface2' from source: play vars
10202 1727204049.37335: variable 'dhcp_interface2' from source: play vars

TASK [Assert that the interface is present - 'test2'] **************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.443) 0:00:11.048 *****

10202 1727204049.37359: entering _queue_task() for managed-node3/assert
10202 1727204049.37612: worker is 1 (out of 1 available)
10202 1727204049.37627: exiting _queue_task() for managed-node3/assert
10202 1727204049.37640: done queuing things up, now waiting for results queue to drain
10202 1727204049.37641: waiting for pending results...
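The assert task queued here comes from `assert_device_present.yml:5`, and follows the `stat` call whose result (with `stat.exists: true`) was just returned. Given the task name, it plausibly verifies that result along the lines of the sketch below — an assumption inferred from the log, not the actual file contents; `interface_stat` is a hypothetical registered-variable name:

```yaml
# Hypothetical sketch of assert_device_present.yml; the registered
# variable name and failure message are assumptions.
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
    msg: "Interface '{{ interface }}' is not present on the managed host"
```

If the preceding `stat` had reported `exists: false`, this assert would fail the task for that host rather than continuing silently.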
10202 1727204049.37812: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' 10202 1727204049.37891: in run() - task 127b8e07-fff9-0b04-2570-00000000001c 10202 1727204049.37902: variable 'ansible_search_path' from source: unknown 10202 1727204049.37906: variable 'ansible_search_path' from source: unknown 10202 1727204049.37940: calling self._execute() 10202 1727204049.38295: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.38301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.38312: variable 'omit' from source: magic vars 10202 1727204049.38585: variable 'ansible_distribution_major_version' from source: facts 10202 1727204049.38596: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204049.38602: variable 'omit' from source: magic vars 10202 1727204049.38642: variable 'omit' from source: magic vars 10202 1727204049.38717: variable 'interface' from source: task vars 10202 1727204049.38721: variable 'dhcp_interface2' from source: play vars 10202 1727204049.38776: variable 'dhcp_interface2' from source: play vars 10202 1727204049.38790: variable 'omit' from source: magic vars 10202 1727204049.38826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204049.38864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204049.38883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204049.38898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204049.38907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204049.38935: variable 'inventory_hostname' from source: host 
vars for 'managed-node3' 10202 1727204049.38938: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.38941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.39019: Set connection var ansible_shell_type to sh 10202 1727204049.39023: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204049.39029: Set connection var ansible_connection to ssh 10202 1727204049.39037: Set connection var ansible_shell_executable to /bin/sh 10202 1727204049.39042: Set connection var ansible_pipelining to False 10202 1727204049.39048: Set connection var ansible_timeout to 10 10202 1727204049.39074: variable 'ansible_shell_executable' from source: unknown 10202 1727204049.39079: variable 'ansible_connection' from source: unknown 10202 1727204049.39082: variable 'ansible_module_compression' from source: unknown 10202 1727204049.39085: variable 'ansible_shell_type' from source: unknown 10202 1727204049.39087: variable 'ansible_shell_executable' from source: unknown 10202 1727204049.39089: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.39092: variable 'ansible_pipelining' from source: unknown 10202 1727204049.39094: variable 'ansible_timeout' from source: unknown 10202 1727204049.39097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.39214: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204049.39219: variable 'omit' from source: magic vars 10202 1727204049.39225: starting attempt loop 10202 1727204049.39228: running the handler 10202 1727204049.39330: variable 'interface_stat' from source: set_fact 10202 1727204049.39347: Evaluated conditional 
(interface_stat.stat.exists): True 10202 1727204049.39353: handler run complete 10202 1727204049.39364: attempt loop complete, returning result 10202 1727204049.39368: _execute() done 10202 1727204049.39371: dumping result to json 10202 1727204049.39374: done dumping result, returning 10202 1727204049.39382: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' [127b8e07-fff9-0b04-2570-00000000001c] 10202 1727204049.39389: sending task result for task 127b8e07-fff9-0b04-2570-00000000001c 10202 1727204049.39477: done sending task result for task 127b8e07-fff9-0b04-2570-00000000001c 10202 1727204049.39480: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204049.39886: no more pending results, returning what we have 10202 1727204049.39893: results queue empty 10202 1727204049.39894: checking for any_errors_fatal 10202 1727204049.39904: done checking for any_errors_fatal 10202 1727204049.39905: checking for max_fail_percentage 10202 1727204049.39906: done checking for max_fail_percentage 10202 1727204049.39907: checking to see if all hosts have failed and the running result is not ok 10202 1727204049.39908: done checking to see if all hosts have failed 10202 1727204049.39909: getting the remaining hosts for this loop 10202 1727204049.39910: done getting the remaining hosts for this loop 10202 1727204049.39914: getting the next task for host managed-node3 10202 1727204049.39920: done getting next task for host managed-node3 10202 1727204049.39923: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 10202 1727204049.39925: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204049.39927: getting variables 10202 1727204049.39928: in VariableManager get_vars() 10202 1727204049.39961: Calling all_inventory to load vars for managed-node3 10202 1727204049.39964: Calling groups_inventory to load vars for managed-node3 10202 1727204049.39970: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204049.39981: Calling all_plugins_play to load vars for managed-node3 10202 1727204049.39983: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204049.39986: Calling groups_plugins_play to load vars for managed-node3 10202 1727204049.40171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204049.40391: done with get_vars() 10202 1727204049.40403: done getting variables 10202 1727204049.40486: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.031) 0:00:11.080 ***** 10202 1727204049.40516: entering _queue_task() for managed-node3/command 10202 1727204049.40987: worker is 1 (out of 1 available) 10202 1727204049.41000: exiting _queue_task() for managed-node3/command 10202 1727204049.41010: done queuing things up, now waiting for results queue to drain 10202 1727204049.41012: waiting for pending results... 
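The "Assert that the interface is present - 'test2'" result above ("All assertions passed" after `Evaluated conditional (interface_stat.stat.exists): True`) is the trace of an `assert` action. A minimal sketch of what such a task looks like follows — this is a hypothetical reconstruction, not the actual contents of `assert_device_present.yml`, and the `fail_msg` wording and the earlier `stat` registration are assumptions:

```yaml
# Hypothetical sketch of the kind of task traced above. The real task at
# tests/network/playbooks/tasks/assert_device_present.yml:5 may differ.
# interface_stat is assumed to have been registered by an earlier stat task.
- name: Get stat for interface '{{ interface }}'
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # illustrative path
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
    fail_msg: "Interface {{ interface }} is not present"
```

Note how the task name in the log was templated from `{{ interface }}`, which the trace shows resolving through the play var `dhcp_interface2` to `test2`.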
10202 1727204049.41222: running TaskExecutor() for managed-node3/TASK: Backup the /etc/resolv.conf for initscript 10202 1727204049.41302: in run() - task 127b8e07-fff9-0b04-2570-00000000001d 10202 1727204049.41315: variable 'ansible_search_path' from source: unknown 10202 1727204049.41349: calling self._execute() 10202 1727204049.41421: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.41430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.41437: variable 'omit' from source: magic vars 10202 1727204049.41748: variable 'ansible_distribution_major_version' from source: facts 10202 1727204049.41759: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204049.41848: variable 'network_provider' from source: set_fact 10202 1727204049.41853: Evaluated conditional (network_provider == "initscripts"): False 10202 1727204049.41856: when evaluation is False, skipping this task 10202 1727204049.41860: _execute() done 10202 1727204049.41865: dumping result to json 10202 1727204049.41869: done dumping result, returning 10202 1727204049.41877: done running TaskExecutor() for managed-node3/TASK: Backup the /etc/resolv.conf for initscript [127b8e07-fff9-0b04-2570-00000000001d] 10202 1727204049.41882: sending task result for task 127b8e07-fff9-0b04-2570-00000000001d 10202 1727204049.41983: done sending task result for task 127b8e07-fff9-0b04-2570-00000000001d 10202 1727204049.41986: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10202 1727204049.42049: no more pending results, returning what we have 10202 1727204049.42052: results queue empty 10202 1727204049.42053: checking for any_errors_fatal 10202 1727204049.42060: done checking for any_errors_fatal 10202 1727204049.42061: checking for max_fail_percentage 10202 1727204049.42063: done checking 
for max_fail_percentage 10202 1727204049.42064: checking to see if all hosts have failed and the running result is not ok 10202 1727204049.42065: done checking to see if all hosts have failed 10202 1727204049.42067: getting the remaining hosts for this loop 10202 1727204049.42069: done getting the remaining hosts for this loop 10202 1727204049.42073: getting the next task for host managed-node3 10202 1727204049.42078: done getting next task for host managed-node3 10202 1727204049.42081: ^ task is: TASK: TEST Add Bond with 2 ports 10202 1727204049.42083: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204049.42086: getting variables 10202 1727204049.42087: in VariableManager get_vars() 10202 1727204049.42130: Calling all_inventory to load vars for managed-node3 10202 1727204049.42133: Calling groups_inventory to load vars for managed-node3 10202 1727204049.42135: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204049.42146: Calling all_plugins_play to load vars for managed-node3 10202 1727204049.42149: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204049.42151: Calling groups_plugins_play to load vars for managed-node3 10202 1727204049.42299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204049.42452: done with get_vars() 10202 1727204049.42461: done getting variables 10202 1727204049.42511: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.020) 0:00:11.100 ***** 10202 1727204049.42535: entering _queue_task() for managed-node3/debug 10202 1727204049.42771: worker is 1 (out of 1 available) 10202 1727204049.42786: exiting _queue_task() for managed-node3/debug 10202 1727204049.42799: done queuing things up, now waiting for results queue to drain 10202 1727204049.42801: waiting for pending results... 10202 1727204049.42977: running TaskExecutor() for managed-node3/TASK: TEST Add Bond with 2 ports 10202 1727204049.43043: in run() - task 127b8e07-fff9-0b04-2570-00000000001e 10202 1727204049.43055: variable 'ansible_search_path' from source: unknown 10202 1727204049.43089: calling self._execute() 10202 1727204049.43162: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.43170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.43180: variable 'omit' from source: magic vars 10202 1727204049.43474: variable 'ansible_distribution_major_version' from source: facts 10202 1727204049.43485: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204049.43497: variable 'omit' from source: magic vars 10202 1727204049.43511: variable 'omit' from source: magic vars 10202 1727204049.43541: variable 'omit' from source: magic vars 10202 1727204049.43578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204049.43611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204049.43627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204049.43644: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204049.43654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204049.43682: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204049.43686: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.43688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.43798: Set connection var ansible_shell_type to sh 10202 1727204049.43813: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204049.43818: Set connection var ansible_connection to ssh 10202 1727204049.43820: Set connection var ansible_shell_executable to /bin/sh 10202 1727204049.43847: Set connection var ansible_pipelining to False 10202 1727204049.43851: Set connection var ansible_timeout to 10 10202 1727204049.43861: variable 'ansible_shell_executable' from source: unknown 10202 1727204049.43864: variable 'ansible_connection' from source: unknown 10202 1727204049.43868: variable 'ansible_module_compression' from source: unknown 10202 1727204049.43871: variable 'ansible_shell_type' from source: unknown 10202 1727204049.43873: variable 'ansible_shell_executable' from source: unknown 10202 1727204049.43877: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.43901: variable 'ansible_pipelining' from source: unknown 10202 1727204049.43904: variable 'ansible_timeout' from source: unknown 10202 1727204049.43906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.44275: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204049.44278: variable 'omit' from source: magic vars 10202 1727204049.44281: starting attempt loop 10202 1727204049.44283: running the handler 10202 1727204049.44285: handler run complete 10202 1727204049.44287: attempt loop complete, returning result 10202 1727204049.44288: _execute() done 10202 1727204049.44290: dumping result to json 10202 1727204049.44292: done dumping result, returning 10202 1727204049.44293: done running TaskExecutor() for managed-node3/TASK: TEST Add Bond with 2 ports [127b8e07-fff9-0b04-2570-00000000001e] 10202 1727204049.44295: sending task result for task 127b8e07-fff9-0b04-2570-00000000001e 10202 1727204049.44360: done sending task result for task 127b8e07-fff9-0b04-2570-00000000001e 10202 1727204049.44363: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: ################################################## 10202 1727204049.44406: no more pending results, returning what we have 10202 1727204049.44410: results queue empty 10202 1727204049.44411: checking for any_errors_fatal 10202 1727204049.44416: done checking for any_errors_fatal 10202 1727204049.44417: checking for max_fail_percentage 10202 1727204049.44418: done checking for max_fail_percentage 10202 1727204049.44419: checking to see if all hosts have failed and the running result is not ok 10202 1727204049.44420: done checking to see if all hosts have failed 10202 1727204049.44421: getting the remaining hosts for this loop 10202 1727204049.44422: done getting the remaining hosts for this loop 10202 1727204049.44425: getting the next task for host managed-node3 10202 1727204049.44432: done getting next task for host managed-node3 10202 1727204049.44437: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10202 1727204049.44439: ^ state is: HOST STATE: block=2, 
task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204049.44454: getting variables 10202 1727204049.44455: in VariableManager get_vars() 10202 1727204049.44508: Calling all_inventory to load vars for managed-node3 10202 1727204049.44511: Calling groups_inventory to load vars for managed-node3 10202 1727204049.44514: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204049.44524: Calling all_plugins_play to load vars for managed-node3 10202 1727204049.44528: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204049.44531: Calling groups_plugins_play to load vars for managed-node3 10202 1727204049.44729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204049.44939: done with get_vars() 10202 1727204049.44955: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.025) 0:00:11.125 ***** 10202 1727204049.45059: entering _queue_task() for managed-node3/include_tasks 10202 1727204049.45392: worker is 1 (out of 1 available) 10202 1727204049.45406: exiting _queue_task() for managed-node3/include_tasks 10202 1727204049.45422: done queuing things up, now waiting for results queue to drain 10202 1727204049.45424: waiting for pending results... 
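The "Backup the /etc/resolv.conf for initscript" task above was skipped because `Evaluated conditional (network_provider == "initscripts"): False` — Ansible evaluates each `when:` clause before running the action and, when one is False, reports `skipping` with the `false_condition` shown in the result. A hedged sketch of a task guarded this way (the command and backup path are illustrative, not taken from `tests_bond.yml`):

```yaml
# Hypothetical sketch; the real task at tests_bond.yml:28 may differ.
# The trace shows two conditionals evaluated: the distribution check
# (True) and the provider check (False), so the task is skipped.
- name: Backup the /etc/resolv.conf for initscript
  ansible.builtin.command: cp /etc/resolv.conf /tmp/resolv.conf.bak  # illustrative
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
```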
10202 1727204049.45818: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10202 1727204049.45880: in run() - task 127b8e07-fff9-0b04-2570-000000000026 10202 1727204049.45884: variable 'ansible_search_path' from source: unknown 10202 1727204049.45887: variable 'ansible_search_path' from source: unknown 10202 1727204049.45920: calling self._execute() 10202 1727204049.46109: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.46118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.46131: variable 'omit' from source: magic vars 10202 1727204049.46410: variable 'ansible_distribution_major_version' from source: facts 10202 1727204049.46423: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204049.46428: _execute() done 10202 1727204049.46435: dumping result to json 10202 1727204049.46438: done dumping result, returning 10202 1727204049.46447: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-0b04-2570-000000000026] 10202 1727204049.46450: sending task result for task 127b8e07-fff9-0b04-2570-000000000026 10202 1727204049.46542: done sending task result for task 127b8e07-fff9-0b04-2570-000000000026 10202 1727204049.46545: WORKER PROCESS EXITING 10202 1727204049.46604: no more pending results, returning what we have 10202 1727204049.46608: in VariableManager get_vars() 10202 1727204049.46655: Calling all_inventory to load vars for managed-node3 10202 1727204049.46660: Calling groups_inventory to load vars for managed-node3 10202 1727204049.46662: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204049.46673: Calling all_plugins_play to load vars for managed-node3 10202 1727204049.46676: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204049.46679: Calling 
groups_plugins_play to load vars for managed-node3 10202 1727204049.46867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204049.46994: done with get_vars() 10202 1727204049.47001: variable 'ansible_search_path' from source: unknown 10202 1727204049.47001: variable 'ansible_search_path' from source: unknown 10202 1727204049.47032: we have included files to process 10202 1727204049.47033: generating all_blocks data 10202 1727204049.47034: done generating all_blocks data 10202 1727204049.47037: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10202 1727204049.47038: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10202 1727204049.47039: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10202 1727204049.47561: done processing included file 10202 1727204049.47563: iterating over new_blocks loaded from include file 10202 1727204049.47564: in VariableManager get_vars() 10202 1727204049.47585: done with get_vars() 10202 1727204049.47587: filtering new block on tags 10202 1727204049.47599: done filtering new block on tags 10202 1727204049.47601: in VariableManager get_vars() 10202 1727204049.47617: done with get_vars() 10202 1727204049.47618: filtering new block on tags 10202 1727204049.47637: done filtering new block on tags 10202 1727204049.47639: in VariableManager get_vars() 10202 1727204049.47655: done with get_vars() 10202 1727204049.47656: filtering new block on tags 10202 1727204049.47669: done filtering new block on tags 10202 1727204049.47670: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 10202 1727204049.47675: extending task lists for 
all hosts with included blocks 10202 1727204049.48225: done extending task lists 10202 1727204049.48229: done processing included files 10202 1727204049.48230: results queue empty 10202 1727204049.48230: checking for any_errors_fatal 10202 1727204049.48232: done checking for any_errors_fatal 10202 1727204049.48233: checking for max_fail_percentage 10202 1727204049.48233: done checking for max_fail_percentage 10202 1727204049.48234: checking to see if all hosts have failed and the running result is not ok 10202 1727204049.48235: done checking to see if all hosts have failed 10202 1727204049.48235: getting the remaining hosts for this loop 10202 1727204049.48236: done getting the remaining hosts for this loop 10202 1727204049.48238: getting the next task for host managed-node3 10202 1727204049.48241: done getting next task for host managed-node3 10202 1727204049.48243: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10202 1727204049.48246: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204049.48253: getting variables 10202 1727204049.48254: in VariableManager get_vars() 10202 1727204049.48268: Calling all_inventory to load vars for managed-node3 10202 1727204049.48270: Calling groups_inventory to load vars for managed-node3 10202 1727204049.48273: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204049.48278: Calling all_plugins_play to load vars for managed-node3 10202 1727204049.48280: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204049.48282: Calling groups_plugins_play to load vars for managed-node3 10202 1727204049.48402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204049.48570: done with get_vars() 10202 1727204049.48580: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.035) 0:00:11.161 ***** 10202 1727204049.48660: entering _queue_task() for managed-node3/setup 10202 1727204049.49016: worker is 1 (out of 1 available) 10202 1727204049.49031: exiting _queue_task() for managed-node3/setup 10202 1727204049.49047: done queuing things up, now waiting for results queue to drain 10202 1727204049.49048: waiting for pending results... 
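The `include_tasks` expansion traced above ("we have included files to process ... generating all_blocks data ... filtering new block on tags ... extending task lists") is what happens when the role's `main.yml` dynamically includes `set_facts.yml`: the file is loaded at run time, its blocks are generated, filtered against the active tags, and appended to the host's task list. A minimal sketch of such an include — an assumption about its shape, not the verbatim contents of `roles/network/tasks/main.yml`:

```yaml
# Hypothetical sketch of the include traced above; the real task at
# roles/network/tasks/main.yml:4 may carry additional tags or conditionals.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
```

Because this is a dynamic include, the new tasks (such as "Ensure ansible_facts used by role are present" and "Check if system is ostree", which follow in the log) only appear in the host state after the include executes.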
10202 1727204049.49388: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10202 1727204049.49514: in run() - task 127b8e07-fff9-0b04-2570-000000000188 10202 1727204049.49544: variable 'ansible_search_path' from source: unknown 10202 1727204049.49549: variable 'ansible_search_path' from source: unknown 10202 1727204049.49595: calling self._execute() 10202 1727204049.49661: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.49669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.49678: variable 'omit' from source: magic vars 10202 1727204049.49988: variable 'ansible_distribution_major_version' from source: facts 10202 1727204049.49997: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204049.50171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204049.51831: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204049.51885: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204049.51914: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204049.51954: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204049.51977: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204049.52046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204049.52072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204049.52090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204049.52120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204049.52133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204049.52180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204049.52197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204049.52214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204049.52246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204049.52257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204049.52383: variable '__network_required_facts' from source: role 
'' defaults 10202 1727204049.52395: variable 'ansible_facts' from source: unknown 10202 1727204049.52458: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10202 1727204049.52462: when evaluation is False, skipping this task 10202 1727204049.52464: _execute() done 10202 1727204049.52469: dumping result to json 10202 1727204049.52471: done dumping result, returning 10202 1727204049.52479: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-0b04-2570-000000000188] 10202 1727204049.52486: sending task result for task 127b8e07-fff9-0b04-2570-000000000188 10202 1727204049.52583: done sending task result for task 127b8e07-fff9-0b04-2570-000000000188 10202 1727204049.52585: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204049.52631: no more pending results, returning what we have 10202 1727204049.52635: results queue empty 10202 1727204049.52635: checking for any_errors_fatal 10202 1727204049.52637: done checking for any_errors_fatal 10202 1727204049.52637: checking for max_fail_percentage 10202 1727204049.52639: done checking for max_fail_percentage 10202 1727204049.52640: checking to see if all hosts have failed and the running result is not ok 10202 1727204049.52641: done checking to see if all hosts have failed 10202 1727204049.52641: getting the remaining hosts for this loop 10202 1727204049.52643: done getting the remaining hosts for this loop 10202 1727204049.52648: getting the next task for host managed-node3 10202 1727204049.52657: done getting next task for host managed-node3 10202 1727204049.52661: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10202 1727204049.52667: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204049.52681: getting variables 10202 1727204049.52682: in VariableManager get_vars() 10202 1727204049.52723: Calling all_inventory to load vars for managed-node3 10202 1727204049.52726: Calling groups_inventory to load vars for managed-node3 10202 1727204049.52728: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204049.52738: Calling all_plugins_play to load vars for managed-node3 10202 1727204049.52741: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204049.52744: Calling groups_plugins_play to load vars for managed-node3 10202 1727204049.52909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204049.53068: done with get_vars() 10202 1727204049.53078: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.044) 0:00:11.206 ***** 10202 1727204049.53160: entering _queue_task() for managed-node3/stat 10202 1727204049.53394: worker is 1 (out of 1 
available) 10202 1727204049.53409: exiting _queue_task() for managed-node3/stat 10202 1727204049.53421: done queuing things up, now waiting for results queue to drain 10202 1727204049.53423: waiting for pending results... 10202 1727204049.53600: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 10202 1727204049.53701: in run() - task 127b8e07-fff9-0b04-2570-00000000018a 10202 1727204049.53713: variable 'ansible_search_path' from source: unknown 10202 1727204049.53717: variable 'ansible_search_path' from source: unknown 10202 1727204049.53748: calling self._execute() 10202 1727204049.53819: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.53824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.53834: variable 'omit' from source: magic vars 10202 1727204049.54134: variable 'ansible_distribution_major_version' from source: facts 10202 1727204049.54145: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204049.54277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204049.54491: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204049.54534: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204049.54564: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204049.54591: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204049.54661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204049.54684: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204049.54703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204049.54722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204049.54797: variable '__network_is_ostree' from source: set_fact 10202 1727204049.54804: Evaluated conditional (not __network_is_ostree is defined): False 10202 1727204049.54807: when evaluation is False, skipping this task 10202 1727204049.54810: _execute() done 10202 1727204049.54814: dumping result to json 10202 1727204049.54819: done dumping result, returning 10202 1727204049.54829: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-0b04-2570-00000000018a] 10202 1727204049.54832: sending task result for task 127b8e07-fff9-0b04-2570-00000000018a 10202 1727204049.54923: done sending task result for task 127b8e07-fff9-0b04-2570-00000000018a 10202 1727204049.54926: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10202 1727204049.55010: no more pending results, returning what we have 10202 1727204049.55013: results queue empty 10202 1727204049.55014: checking for any_errors_fatal 10202 1727204049.55023: done checking for any_errors_fatal 10202 1727204049.55024: checking for max_fail_percentage 10202 1727204049.55025: done checking for max_fail_percentage 10202 1727204049.55026: checking to see if all hosts have failed and the running result is not ok 10202 
1727204049.55030: done checking to see if all hosts have failed 10202 1727204049.55030: getting the remaining hosts for this loop 10202 1727204049.55032: done getting the remaining hosts for this loop 10202 1727204049.55037: getting the next task for host managed-node3 10202 1727204049.55051: done getting next task for host managed-node3 10202 1727204049.55055: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10202 1727204049.55060: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204049.55076: getting variables 10202 1727204049.55077: in VariableManager get_vars() 10202 1727204049.55115: Calling all_inventory to load vars for managed-node3 10202 1727204049.55118: Calling groups_inventory to load vars for managed-node3 10202 1727204049.55120: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204049.55132: Calling all_plugins_play to load vars for managed-node3 10202 1727204049.55135: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204049.55138: Calling groups_plugins_play to load vars for managed-node3 10202 1727204049.55276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204049.55410: done with get_vars() 10202 1727204049.55420: done getting variables 10202 1727204049.55470: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.023) 0:00:11.230 ***** 10202 1727204049.55501: entering _queue_task() for managed-node3/set_fact 10202 1727204049.55758: worker is 1 (out of 1 available) 10202 1727204049.55775: exiting _queue_task() for managed-node3/set_fact 10202 1727204049.55789: done queuing things up, now waiting for results queue to drain 10202 1727204049.55791: waiting for pending results... 
10202 1727204049.55970: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10202 1727204049.56078: in run() - task 127b8e07-fff9-0b04-2570-00000000018b 10202 1727204049.56092: variable 'ansible_search_path' from source: unknown 10202 1727204049.56095: variable 'ansible_search_path' from source: unknown 10202 1727204049.56130: calling self._execute() 10202 1727204049.56196: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.56202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.56210: variable 'omit' from source: magic vars 10202 1727204049.56504: variable 'ansible_distribution_major_version' from source: facts 10202 1727204049.56514: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204049.56644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204049.56922: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204049.56960: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204049.56988: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204049.57018: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204049.57088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204049.57107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204049.57133: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204049.57151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204049.57232: variable '__network_is_ostree' from source: set_fact 10202 1727204049.57236: Evaluated conditional (not __network_is_ostree is defined): False 10202 1727204049.57239: when evaluation is False, skipping this task 10202 1727204049.57241: _execute() done 10202 1727204049.57246: dumping result to json 10202 1727204049.57248: done dumping result, returning 10202 1727204049.57254: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-0b04-2570-00000000018b] 10202 1727204049.57259: sending task result for task 127b8e07-fff9-0b04-2570-00000000018b 10202 1727204049.57354: done sending task result for task 127b8e07-fff9-0b04-2570-00000000018b 10202 1727204049.57358: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10202 1727204049.57404: no more pending results, returning what we have 10202 1727204049.57407: results queue empty 10202 1727204049.57408: checking for any_errors_fatal 10202 1727204049.57413: done checking for any_errors_fatal 10202 1727204049.57414: checking for max_fail_percentage 10202 1727204049.57415: done checking for max_fail_percentage 10202 1727204049.57417: checking to see if all hosts have failed and the running result is not ok 10202 1727204049.57418: done checking to see if all hosts have failed 10202 1727204049.57418: getting the remaining hosts for this loop 10202 1727204049.57420: done getting the remaining hosts for this loop 
10202 1727204049.57424: getting the next task for host managed-node3 10202 1727204049.57436: done getting next task for host managed-node3 10202 1727204049.57440: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10202 1727204049.57445: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204049.57459: getting variables 10202 1727204049.57460: in VariableManager get_vars() 10202 1727204049.57510: Calling all_inventory to load vars for managed-node3 10202 1727204049.57513: Calling groups_inventory to load vars for managed-node3 10202 1727204049.57515: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204049.57524: Calling all_plugins_play to load vars for managed-node3 10202 1727204049.57529: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204049.57533: Calling groups_plugins_play to load vars for managed-node3 10202 1727204049.57722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204049.57857: done with get_vars() 10202 1727204049.57869: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.024) 0:00:11.254 ***** 10202 1727204049.57951: entering _queue_task() for managed-node3/service_facts 10202 1727204049.57952: Creating lock for service_facts 10202 1727204049.58240: worker is 1 (out of 1 available) 10202 1727204049.58255: exiting _queue_task() for managed-node3/service_facts 10202 1727204049.58271: done queuing things up, now waiting for results queue to drain 10202 1727204049.58272: waiting for pending results... 
10202 1727204049.58448: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 10202 1727204049.58545: in run() - task 127b8e07-fff9-0b04-2570-00000000018d 10202 1727204049.58558: variable 'ansible_search_path' from source: unknown 10202 1727204049.58562: variable 'ansible_search_path' from source: unknown 10202 1727204049.58596: calling self._execute() 10202 1727204049.58667: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.58673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.58681: variable 'omit' from source: magic vars 10202 1727204049.58976: variable 'ansible_distribution_major_version' from source: facts 10202 1727204049.58986: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204049.58993: variable 'omit' from source: magic vars 10202 1727204049.59048: variable 'omit' from source: magic vars 10202 1727204049.59075: variable 'omit' from source: magic vars 10202 1727204049.59109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204049.59140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204049.59160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204049.59177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204049.59187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204049.59213: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204049.59216: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.59219: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 10202 1727204049.59298: Set connection var ansible_shell_type to sh 10202 1727204049.59304: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204049.59310: Set connection var ansible_connection to ssh 10202 1727204049.59315: Set connection var ansible_shell_executable to /bin/sh 10202 1727204049.59321: Set connection var ansible_pipelining to False 10202 1727204049.59326: Set connection var ansible_timeout to 10 10202 1727204049.59346: variable 'ansible_shell_executable' from source: unknown 10202 1727204049.59349: variable 'ansible_connection' from source: unknown 10202 1727204049.59352: variable 'ansible_module_compression' from source: unknown 10202 1727204049.59355: variable 'ansible_shell_type' from source: unknown 10202 1727204049.59357: variable 'ansible_shell_executable' from source: unknown 10202 1727204049.59360: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204049.59362: variable 'ansible_pipelining' from source: unknown 10202 1727204049.59372: variable 'ansible_timeout' from source: unknown 10202 1727204049.59375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204049.59535: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204049.59544: variable 'omit' from source: magic vars 10202 1727204049.59549: starting attempt loop 10202 1727204049.59552: running the handler 10202 1727204049.59567: _low_level_execute_command(): starting 10202 1727204049.59574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204049.60140: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 10202 1727204049.60145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204049.60150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.60208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204049.60212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204049.60215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.60303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.62143: stdout chunk (state=3): >>>/root <<< 10202 1727204049.62256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204049.62323: stderr chunk (state=3): >>><<< 10202 1727204049.62326: stdout chunk (state=3): >>><<< 10202 1727204049.62354: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204049.62367: _low_level_execute_command(): starting 10202 1727204049.62375: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224 `" && echo ansible-tmp-1727204049.6235344-10926-150437595945224="` echo /root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224 `" ) && sleep 0' 10202 1727204049.62858: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204049.62863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 10202 1727204049.62878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.62907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204049.62911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.62970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204049.62974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204049.62976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.63057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.65263: stdout chunk (state=3): >>>ansible-tmp-1727204049.6235344-10926-150437595945224=/root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224 <<< 10202 1727204049.65379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204049.65445: stderr chunk (state=3): >>><<< 10202 1727204049.65449: stdout chunk (state=3): >>><<< 10202 1727204049.65464: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204049.6235344-10926-150437595945224=/root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204049.65514: variable 'ansible_module_compression' from source: unknown 10202 1727204049.65557: ANSIBALLZ: Using lock for service_facts 10202 1727204049.65561: ANSIBALLZ: Acquiring lock 10202 1727204049.65563: ANSIBALLZ: Lock acquired: 140045305436144 10202 1727204049.65568: ANSIBALLZ: Creating module 10202 1727204049.78773: ANSIBALLZ: Writing module into payload 10202 1727204049.78778: ANSIBALLZ: Writing module 10202 1727204049.78781: ANSIBALLZ: Renaming module 10202 1727204049.78783: ANSIBALLZ: Done creating module 10202 1727204049.78791: variable 'ansible_facts' from source: unknown 10202 1727204049.78871: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/AnsiballZ_service_facts.py 10202 1727204049.79143: Sending initial data 10202 1727204049.79153: Sent initial data (162 bytes) 10202 1727204049.80594: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.80633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204049.80647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.80743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.82541: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10202 1727204049.82562: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10202 1727204049.82607: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204049.82704: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204049.82789: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpfs0b3qny /root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/AnsiballZ_service_facts.py <<< 10202 1727204049.82798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/AnsiballZ_service_facts.py" <<< 10202 1727204049.82849: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpfs0b3qny" to remote "/root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/AnsiballZ_service_facts.py" <<< 10202 1727204049.84713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204049.84718: stdout chunk (state=3): >>><<< 10202 1727204049.84721: stderr chunk (state=3): >>><<< 10202 1727204049.84723: done transferring module to remote 10202 1727204049.84725: _low_level_execute_command(): starting 10202 1727204049.84731: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/ /root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/AnsiballZ_service_facts.py && sleep 0' 10202 1727204049.85412: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204049.85417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.85420: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204049.85426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204049.85432: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204049.85472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204049.85485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204049.85581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204049.88012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204049.88016: stdout chunk (state=3): >>><<< 10202 1727204049.88018: stderr chunk (state=3): >>><<< 10202 1727204049.88040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204049.88049: _low_level_execute_command(): starting 10202 1727204049.88059: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/AnsiballZ_service_facts.py && sleep 0' 10202 1727204049.89492: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204049.89588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204049.89648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 
1727204049.89739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204052.32332: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 10202 1727204052.32355: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": 
{"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-upd<<< 10202 1727204052.32389: stdout chunk (state=3): >>>ate-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": 
"systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": 
{"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.ser<<< 10202 1727204052.32393: stdout chunk (state=3): >>>vice": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": 
{"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", 
"source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10202 1727204052.34223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204052.34289: stderr chunk (state=3): >>><<< 10202 1727204052.34292: stdout chunk (state=3): >>><<< 10202 1727204052.34316: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", 
"source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": 
{"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.45.169 closed. 10202 1727204052.35770: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204052.35778: _low_level_execute_command(): starting 10202 1727204052.35783: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204049.6235344-10926-150437595945224/ > /dev/null 2>&1 && sleep 0' 10202 1727204052.36290: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204052.36295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.36297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204052.36299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.36350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204052.36353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204052.36438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204052.45390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204052.45453: stderr chunk (state=3): >>><<< 10202 1727204052.45457: stdout chunk (state=3): >>><<< 10202 1727204052.45472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 10202 1727204052.45479: handler run complete 10202 1727204052.45622: variable 'ansible_facts' from source: unknown 10202 1727204052.45744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204052.46076: variable 'ansible_facts' from source: unknown 10202 1727204052.46174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204052.46321: attempt loop complete, returning result 10202 1727204052.46324: _execute() done 10202 1727204052.46332: dumping result to json 10202 1727204052.46374: done dumping result, returning 10202 1727204052.46384: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-0b04-2570-00000000018d] 10202 1727204052.46390: sending task result for task 127b8e07-fff9-0b04-2570-00000000018d ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204052.47058: no more pending results, returning what we have 10202 1727204052.47062: results queue empty 10202 1727204052.47062: checking for any_errors_fatal 10202 1727204052.47068: done checking for any_errors_fatal 10202 1727204052.47069: checking for max_fail_percentage 10202 1727204052.47070: done checking for max_fail_percentage 10202 1727204052.47071: checking to see if all hosts have failed and the running result is not ok 10202 1727204052.47072: done checking to see if all hosts have failed 10202 1727204052.47073: getting the remaining hosts for this loop 10202 1727204052.47074: done getting the remaining hosts for this loop 10202 1727204052.47078: getting the next task for host managed-node3 10202 1727204052.47083: done getting next task for host managed-node3 10202 1727204052.47090: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 
10202 1727204052.47093: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204052.47103: done sending task result for task 127b8e07-fff9-0b04-2570-00000000018d 10202 1727204052.47106: WORKER PROCESS EXITING 10202 1727204052.47112: getting variables 10202 1727204052.47113: in VariableManager get_vars() 10202 1727204052.47141: Calling all_inventory to load vars for managed-node3 10202 1727204052.47143: Calling groups_inventory to load vars for managed-node3 10202 1727204052.47144: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204052.47152: Calling all_plugins_play to load vars for managed-node3 10202 1727204052.47153: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204052.47155: Calling groups_plugins_play to load vars for managed-node3 10202 1727204052.47517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204052.47849: done with get_vars() 10202 1727204052.47860: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:12 -0400 (0:00:02.899) 0:00:14.154 ***** 10202 1727204052.47935: entering _queue_task() for managed-node3/package_facts 10202 1727204052.47940: Creating lock for package_facts 10202 1727204052.48193: worker is 1 (out of 1 available) 10202 1727204052.48207: exiting _queue_task() for managed-node3/package_facts 10202 1727204052.48220: done queuing things up, now waiting for results queue to drain 10202 1727204052.48222: waiting for pending results... 10202 1727204052.48402: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 10202 1727204052.48511: in run() - task 127b8e07-fff9-0b04-2570-00000000018e 10202 1727204052.48523: variable 'ansible_search_path' from source: unknown 10202 1727204052.48526: variable 'ansible_search_path' from source: unknown 10202 1727204052.48564: calling self._execute() 10202 1727204052.48635: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204052.48640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204052.48649: variable 'omit' from source: magic vars 10202 1727204052.48954: variable 'ansible_distribution_major_version' from source: facts 10202 1727204052.48964: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204052.48972: variable 'omit' from source: magic vars 10202 1727204052.49029: variable 'omit' from source: magic vars 10202 1727204052.49057: variable 'omit' from source: magic vars 10202 1727204052.49105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204052.49139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204052.49154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
10202 1727204052.49171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204052.49181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204052.49207: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204052.49211: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204052.49216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204052.49292: Set connection var ansible_shell_type to sh 10202 1727204052.49297: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204052.49303: Set connection var ansible_connection to ssh 10202 1727204052.49308: Set connection var ansible_shell_executable to /bin/sh 10202 1727204052.49315: Set connection var ansible_pipelining to False 10202 1727204052.49321: Set connection var ansible_timeout to 10 10202 1727204052.49345: variable 'ansible_shell_executable' from source: unknown 10202 1727204052.49349: variable 'ansible_connection' from source: unknown 10202 1727204052.49352: variable 'ansible_module_compression' from source: unknown 10202 1727204052.49354: variable 'ansible_shell_type' from source: unknown 10202 1727204052.49357: variable 'ansible_shell_executable' from source: unknown 10202 1727204052.49360: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204052.49362: variable 'ansible_pipelining' from source: unknown 10202 1727204052.49364: variable 'ansible_timeout' from source: unknown 10202 1727204052.49370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204052.49535: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204052.49545: variable 'omit' from source: magic vars 10202 1727204052.49548: starting attempt loop 10202 1727204052.49551: running the handler 10202 1727204052.49568: _low_level_execute_command(): starting 10202 1727204052.49574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204052.50138: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204052.50143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204052.50148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.50199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204052.50202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204052.50208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204052.50279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204052.52111: stdout 
chunk (state=3): >>>/root <<< 10202 1727204052.52272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204052.52278: stderr chunk (state=3): >>><<< 10202 1727204052.52284: stdout chunk (state=3): >>><<< 10202 1727204052.52305: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204052.52316: _low_level_execute_command(): starting 10202 1727204052.52323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697 `" && echo ansible-tmp-1727204052.5230381-11264-242349903196697="` echo /root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697 `" ) && sleep 0' 10202 1727204052.52853: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.52856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.52909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204052.52916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204052.52918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204052.52988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204052.55188: stdout chunk (state=3): >>>ansible-tmp-1727204052.5230381-11264-242349903196697=/root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697 <<< 10202 1727204052.55268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204052.55334: stderr chunk (state=3): >>><<< 10202 1727204052.55338: stdout chunk (state=3): >>><<< 10202 1727204052.55359: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204052.5230381-11264-242349903196697=/root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697 , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204052.55420: variable 'ansible_module_compression' from source: unknown 10202 1727204052.55462: ANSIBALLZ: Using lock for package_facts 10202 1727204052.55467: ANSIBALLZ: Acquiring lock 10202 1727204052.55470: ANSIBALLZ: Lock acquired: 140045310184864 10202 1727204052.55473: ANSIBALLZ: Creating module 10202 1727204052.90173: ANSIBALLZ: Writing module into payload 10202 1727204052.90235: ANSIBALLZ: Writing module 10202 1727204052.90276: ANSIBALLZ: Renaming module 10202 1727204052.90284: ANSIBALLZ: Done creating module 10202 1727204052.90334: variable 'ansible_facts' from source: unknown 10202 1727204052.90501: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/AnsiballZ_package_facts.py 10202 1727204052.90668: Sending initial data 10202 
1727204052.90672: Sent initial data (162 bytes) 10202 1727204052.91433: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.91497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204052.91512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204052.91539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204052.91656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204052.93477: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204052.93554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10202 1727204052.93634: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpbm0mohix /root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/AnsiballZ_package_facts.py <<< 10202 1727204052.93638: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/AnsiballZ_package_facts.py" <<< 10202 1727204052.93703: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpbm0mohix" to remote "/root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/AnsiballZ_package_facts.py" <<< 10202 1727204052.95372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204052.95430: stderr chunk (state=3): >>><<< 10202 1727204052.95434: stdout chunk (state=3): >>><<< 10202 1727204052.95457: done transferring module to remote 10202 1727204052.95473: _low_level_execute_command(): starting 10202 1727204052.95477: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/ /root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/AnsiballZ_package_facts.py && sleep 0' 10202 1727204052.96244: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204052.96249: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.96251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204052.96270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204052.96276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204052.96291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.96410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204052.96482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204052.98774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204052.98778: stderr chunk (state=3): >>><<< 10202 1727204052.98781: stdout chunk (state=3): >>><<< 10202 1727204052.98784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204052.98786: _low_level_execute_command(): starting 10202 1727204052.98789: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/AnsiballZ_package_facts.py && sleep 0' 10202 1727204052.99390: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204052.99397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204052.99409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204052.99424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204052.99437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204052.99496: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204052.99564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204052.99597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204052.99614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204052.99726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204053.64762: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": 
"20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": 
"7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", 
"version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", 
"version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": 
"2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": 
"14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 10202 1727204053.64896: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", 
"version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", 
"version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": 
[{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": 
[{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", 
"release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": 
"perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": 
"5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": 
"perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": 
"0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", 
"version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 10202 1727204053.64940: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10202 1727204053.67140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204053.67145: stdout chunk (state=3): >>><<< 10202 1727204053.67147: stderr chunk (state=3): >>><<< 10202 1727204053.67382: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": 
"elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", 
"version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": 
"9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": 
[{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": 
[{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", 
"release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": 
"volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": 
"2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": 
"506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": 
[{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": 
[{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": 
"systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", 
"release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": 
[{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
10202 1727204053.70761: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204053.70807: _low_level_execute_command(): starting 10202 1727204053.70819: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204052.5230381-11264-242349903196697/ > /dev/null 2>&1 && sleep 0' 10202 1727204053.71584: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204053.71690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204053.71716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204053.71744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204053.71765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204053.71872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204053.73993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204053.74072: stderr chunk (state=3): >>><<< 10202 1727204053.74077: stdout chunk (state=3): >>><<< 10202 1727204053.74080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204053.74083: handler run complete 10202 
1727204053.74905: variable 'ansible_facts' from source: unknown 10202 1727204053.75446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204053.77088: variable 'ansible_facts' from source: unknown 10202 1727204053.77416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204053.78274: attempt loop complete, returning result 10202 1727204053.78278: _execute() done 10202 1727204053.78280: dumping result to json 10202 1727204053.78494: done dumping result, returning 10202 1727204053.78505: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-0b04-2570-00000000018e] 10202 1727204053.78511: sending task result for task 127b8e07-fff9-0b04-2570-00000000018e 10202 1727204053.80764: done sending task result for task 127b8e07-fff9-0b04-2570-00000000018e 10202 1727204053.80769: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204053.80817: no more pending results, returning what we have 10202 1727204053.80820: results queue empty 10202 1727204053.80820: checking for any_errors_fatal 10202 1727204053.80824: done checking for any_errors_fatal 10202 1727204053.80825: checking for max_fail_percentage 10202 1727204053.80826: done checking for max_fail_percentage 10202 1727204053.80826: checking to see if all hosts have failed and the running result is not ok 10202 1727204053.80829: done checking to see if all hosts have failed 10202 1727204053.80829: getting the remaining hosts for this loop 10202 1727204053.80830: done getting the remaining hosts for this loop 10202 1727204053.80833: getting the next task for host managed-node3 10202 1727204053.80839: done getting next task for host managed-node3 10202 1727204053.80841: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 10202 1727204053.80843: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204053.80851: getting variables 10202 1727204053.80852: in VariableManager get_vars() 10202 1727204053.80881: Calling all_inventory to load vars for managed-node3 10202 1727204053.80883: Calling groups_inventory to load vars for managed-node3 10202 1727204053.80884: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204053.80891: Calling all_plugins_play to load vars for managed-node3 10202 1727204053.80893: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204053.80895: Calling groups_plugins_play to load vars for managed-node3 10202 1727204053.82355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204053.83940: done with get_vars() 10202 1727204053.83970: done getting variables 10202 1727204053.84022: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 
Tuesday 24 September 2024 14:54:13 -0400 (0:00:01.361) 0:00:15.515 ***** 10202 1727204053.84050: entering _queue_task() for managed-node3/debug 10202 1727204053.84322: worker is 1 (out of 1 available) 10202 1727204053.84342: exiting _queue_task() for managed-node3/debug 10202 1727204053.84354: done queuing things up, now waiting for results queue to drain 10202 1727204053.84355: waiting for pending results... 10202 1727204053.84536: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 10202 1727204053.84632: in run() - task 127b8e07-fff9-0b04-2570-000000000027 10202 1727204053.84642: variable 'ansible_search_path' from source: unknown 10202 1727204053.84646: variable 'ansible_search_path' from source: unknown 10202 1727204053.84679: calling self._execute() 10202 1727204053.84750: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204053.84754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204053.84764: variable 'omit' from source: magic vars 10202 1727204053.85271: variable 'ansible_distribution_major_version' from source: facts 10202 1727204053.85275: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204053.85278: variable 'omit' from source: magic vars 10202 1727204053.85281: variable 'omit' from source: magic vars 10202 1727204053.85283: variable 'network_provider' from source: set_fact 10202 1727204053.85301: variable 'omit' from source: magic vars 10202 1727204053.85351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204053.85398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204053.85427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204053.85453: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204053.85473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204053.85515: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204053.85525: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204053.85533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204053.85642: Set connection var ansible_shell_type to sh 10202 1727204053.85654: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204053.85664: Set connection var ansible_connection to ssh 10202 1727204053.85677: Set connection var ansible_shell_executable to /bin/sh 10202 1727204053.85688: Set connection var ansible_pipelining to False 10202 1727204053.85699: Set connection var ansible_timeout to 10 10202 1727204053.85729: variable 'ansible_shell_executable' from source: unknown 10202 1727204053.85737: variable 'ansible_connection' from source: unknown 10202 1727204053.85744: variable 'ansible_module_compression' from source: unknown 10202 1727204053.85750: variable 'ansible_shell_type' from source: unknown 10202 1727204053.85756: variable 'ansible_shell_executable' from source: unknown 10202 1727204053.85762: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204053.85778: variable 'ansible_pipelining' from source: unknown 10202 1727204053.85791: variable 'ansible_timeout' from source: unknown 10202 1727204053.85800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204053.85953: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204053.85974: variable 'omit' from source: magic vars 10202 1727204053.85985: starting attempt loop 10202 1727204053.85993: running the handler 10202 1727204053.86046: handler run complete 10202 1727204053.86071: attempt loop complete, returning result 10202 1727204053.86078: _execute() done 10202 1727204053.86085: dumping result to json 10202 1727204053.86093: done dumping result, returning 10202 1727204053.86106: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-0b04-2570-000000000027] 10202 1727204053.86117: sending task result for task 127b8e07-fff9-0b04-2570-000000000027 ok: [managed-node3] => {} MSG: Using network provider: nm 10202 1727204053.86397: no more pending results, returning what we have 10202 1727204053.86400: results queue empty 10202 1727204053.86401: checking for any_errors_fatal 10202 1727204053.86409: done checking for any_errors_fatal 10202 1727204053.86410: checking for max_fail_percentage 10202 1727204053.86412: done checking for max_fail_percentage 10202 1727204053.86413: checking to see if all hosts have failed and the running result is not ok 10202 1727204053.86414: done checking to see if all hosts have failed 10202 1727204053.86414: getting the remaining hosts for this loop 10202 1727204053.86416: done getting the remaining hosts for this loop 10202 1727204053.86419: getting the next task for host managed-node3 10202 1727204053.86424: done getting next task for host managed-node3 10202 1727204053.86431: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10202 1727204053.86433: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204053.86443: getting variables 10202 1727204053.86444: in VariableManager get_vars() 10202 1727204053.86487: Calling all_inventory to load vars for managed-node3 10202 1727204053.86490: Calling groups_inventory to load vars for managed-node3 10202 1727204053.86498: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204053.86507: Calling all_plugins_play to load vars for managed-node3 10202 1727204053.86509: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204053.86512: Calling groups_plugins_play to load vars for managed-node3 10202 1727204053.87033: done sending task result for task 127b8e07-fff9-0b04-2570-000000000027 10202 1727204053.87039: WORKER PROCESS EXITING 10202 1727204053.87503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204053.88693: done with get_vars() 10202 1727204053.88720: done getting variables 10202 1727204053.88803: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.047) 0:00:15.563 ***** 10202 1727204053.88834: entering _queue_task() for managed-node3/fail 10202 1727204053.88835: Creating lock for fail 10202 1727204053.89111: worker is 1 (out of 1 available) 10202 1727204053.89127: exiting _queue_task() for managed-node3/fail 10202 1727204053.89142: done queuing things up, now waiting for results queue to drain 10202 1727204053.89144: waiting for pending results... 10202 1727204053.89323: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10202 1727204053.89414: in run() - task 127b8e07-fff9-0b04-2570-000000000028 10202 1727204053.89426: variable 'ansible_search_path' from source: unknown 10202 1727204053.89432: variable 'ansible_search_path' from source: unknown 10202 1727204053.89463: calling self._execute() 10202 1727204053.89536: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204053.89540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204053.89549: variable 'omit' from source: magic vars 10202 1727204053.89846: variable 'ansible_distribution_major_version' from source: facts 10202 1727204053.89856: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204053.89947: variable 'network_state' from source: role '' defaults 10202 1727204053.89957: Evaluated conditional (network_state != {}): False 10202 1727204053.89960: when evaluation is False, skipping this task 10202 1727204053.89963: _execute() done 10202 1727204053.89968: dumping result to json 10202 1727204053.89970: done dumping result, returning 10202 1727204053.89978: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-0b04-2570-000000000028] 10202 1727204053.89984: sending task result for task 127b8e07-fff9-0b04-2570-000000000028 10202 1727204053.90082: done sending task result for task 127b8e07-fff9-0b04-2570-000000000028 10202 1727204053.90084: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10202 1727204053.90138: no more pending results, returning what we have 10202 1727204053.90142: results queue empty 10202 1727204053.90143: checking for any_errors_fatal 10202 1727204053.90150: done checking for any_errors_fatal 10202 1727204053.90150: checking for max_fail_percentage 10202 1727204053.90152: done checking for max_fail_percentage 10202 1727204053.90153: checking to see if all hosts have failed and the running result is not ok 10202 1727204053.90154: done checking to see if all hosts have failed 10202 1727204053.90154: getting the remaining hosts for this loop 10202 1727204053.90156: done getting the remaining hosts for this loop 10202 1727204053.90161: getting the next task for host managed-node3 10202 1727204053.90169: done getting next task for host managed-node3 10202 1727204053.90173: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10202 1727204053.90176: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 10202 1727204053.90195: getting variables 10202 1727204053.90196: in VariableManager get_vars() 10202 1727204053.90236: Calling all_inventory to load vars for managed-node3 10202 1727204053.90239: Calling groups_inventory to load vars for managed-node3 10202 1727204053.90241: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204053.90251: Calling all_plugins_play to load vars for managed-node3 10202 1727204053.90253: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204053.90256: Calling groups_plugins_play to load vars for managed-node3 10202 1727204053.91320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204053.92491: done with get_vars() 10202 1727204053.92516: done getting variables 10202 1727204053.92574: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.037) 0:00:15.601 ***** 10202 1727204053.92602: entering _queue_task() for managed-node3/fail 10202 1727204053.92881: worker is 1 (out of 1 available) 10202 1727204053.92895: exiting _queue_task() for managed-node3/fail 10202 1727204053.92909: done queuing things up, now waiting for results queue to drain 10202 1727204053.92910: waiting for pending results... 
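Both "Abort applying the network state configuration ..." tasks above skip for the same reason: the log records `"false_condition": "network_state != {}"`, and `network_state` comes from the role's defaults, where it is an empty dict. A hedged sketch of what such a guard task at `roles/network/tasks/main.yml:11` plausibly looks like — only the two conditions are confirmed by the log output; the `fail` message wording is an assumption:

```yaml
# Hypothetical reconstruction of the guard task seen skipping above.
# The conditions are taken from the log ("ansible_distribution_major_version != '6'"
# evaluated True, "network_state != {}" evaluated False); the msg text is assumed.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network_state variable is not supported with the initscripts provider  # assumed wording
  when:
    - ansible_distribution_major_version != '6'  # True in this run
    - network_state != {}                        # False in this run, so the task skips
```

Because `when` short-circuits on the first false condition, the task is reported as "skipping" rather than failing, which is exactly what the `skip_reason: "Conditional result was False"` result above shows.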
10202 1727204053.93089: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10202 1727204053.93178: in run() - task 127b8e07-fff9-0b04-2570-000000000029 10202 1727204053.93190: variable 'ansible_search_path' from source: unknown 10202 1727204053.93193: variable 'ansible_search_path' from source: unknown 10202 1727204053.93227: calling self._execute() 10202 1727204053.93298: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204053.93303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204053.93312: variable 'omit' from source: magic vars 10202 1727204053.93601: variable 'ansible_distribution_major_version' from source: facts 10202 1727204053.93612: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204053.93700: variable 'network_state' from source: role '' defaults 10202 1727204053.93709: Evaluated conditional (network_state != {}): False 10202 1727204053.93712: when evaluation is False, skipping this task 10202 1727204053.93715: _execute() done 10202 1727204053.93718: dumping result to json 10202 1727204053.93720: done dumping result, returning 10202 1727204053.93730: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-0b04-2570-000000000029] 10202 1727204053.93734: sending task result for task 127b8e07-fff9-0b04-2570-000000000029 10202 1727204053.93830: done sending task result for task 127b8e07-fff9-0b04-2570-000000000029 10202 1727204053.93833: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10202 1727204053.93885: no more pending results, returning what we have 10202 
1727204053.93888: results queue empty 10202 1727204053.93889: checking for any_errors_fatal 10202 1727204053.93896: done checking for any_errors_fatal 10202 1727204053.93897: checking for max_fail_percentage 10202 1727204053.93898: done checking for max_fail_percentage 10202 1727204053.93899: checking to see if all hosts have failed and the running result is not ok 10202 1727204053.93900: done checking to see if all hosts have failed 10202 1727204053.93901: getting the remaining hosts for this loop 10202 1727204053.93902: done getting the remaining hosts for this loop 10202 1727204053.93907: getting the next task for host managed-node3 10202 1727204053.93913: done getting next task for host managed-node3 10202 1727204053.93918: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10202 1727204053.93920: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204053.93940: getting variables 10202 1727204053.93944: in VariableManager get_vars() 10202 1727204053.93982: Calling all_inventory to load vars for managed-node3 10202 1727204053.93985: Calling groups_inventory to load vars for managed-node3 10202 1727204053.93987: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204053.93996: Calling all_plugins_play to load vars for managed-node3 10202 1727204053.93999: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204053.94001: Calling groups_plugins_play to load vars for managed-node3 10202 1727204053.95061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204053.96231: done with get_vars() 10202 1727204053.96255: done getting variables 10202 1727204053.96306: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.037) 0:00:15.638 ***** 10202 1727204053.96337: entering _queue_task() for managed-node3/fail 10202 1727204053.96608: worker is 1 (out of 1 available) 10202 1727204053.96622: exiting _queue_task() for managed-node3/fail 10202 1727204053.96637: done queuing things up, now waiting for results queue to drain 10202 1727204053.96638: waiting for pending results... 
10202 1727204053.96820: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10202 1727204053.96913: in run() - task 127b8e07-fff9-0b04-2570-00000000002a 10202 1727204053.96926: variable 'ansible_search_path' from source: unknown 10202 1727204053.96932: variable 'ansible_search_path' from source: unknown 10202 1727204053.96961: calling self._execute() 10202 1727204053.97033: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204053.97036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204053.97046: variable 'omit' from source: magic vars 10202 1727204053.97342: variable 'ansible_distribution_major_version' from source: facts 10202 1727204053.97353: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204053.97489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204053.99152: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204053.99204: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204053.99259: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204053.99263: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204053.99284: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204053.99349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204053.99375: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204053.99396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204053.99425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204053.99437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204053.99517: variable 'ansible_distribution_major_version' from source: facts 10202 1727204053.99533: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10202 1727204053.99625: variable 'ansible_distribution' from source: facts 10202 1727204053.99632: variable '__network_rh_distros' from source: role '' defaults 10202 1727204053.99638: Evaluated conditional (ansible_distribution in __network_rh_distros): False 10202 1727204053.99641: when evaluation is False, skipping this task 10202 1727204053.99644: _execute() done 10202 1727204053.99649: dumping result to json 10202 1727204053.99652: done dumping result, returning 10202 1727204053.99660: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-0b04-2570-00000000002a] 10202 1727204053.99672: sending task result for task 127b8e07-fff9-0b04-2570-00000000002a 10202 1727204053.99767: done sending task result for task 127b8e07-fff9-0b04-2570-00000000002a 10202 1727204053.99770: WORKER PROCESS EXITING 
skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 10202 1727204053.99843: no more pending results, returning what we have 10202 1727204053.99847: results queue empty 10202 1727204053.99848: checking for any_errors_fatal 10202 1727204053.99853: done checking for any_errors_fatal 10202 1727204053.99854: checking for max_fail_percentage 10202 1727204053.99855: done checking for max_fail_percentage 10202 1727204053.99856: checking to see if all hosts have failed and the running result is not ok 10202 1727204053.99857: done checking to see if all hosts have failed 10202 1727204053.99858: getting the remaining hosts for this loop 10202 1727204053.99859: done getting the remaining hosts for this loop 10202 1727204053.99863: getting the next task for host managed-node3 10202 1727204053.99871: done getting next task for host managed-node3 10202 1727204053.99875: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10202 1727204053.99878: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204053.99900: getting variables 10202 1727204053.99901: in VariableManager get_vars() 10202 1727204053.99941: Calling all_inventory to load vars for managed-node3 10202 1727204053.99944: Calling groups_inventory to load vars for managed-node3 10202 1727204053.99946: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204053.99956: Calling all_plugins_play to load vars for managed-node3 10202 1727204053.99958: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204053.99961: Calling groups_plugins_play to load vars for managed-node3 10202 1727204054.00935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204054.02106: done with get_vars() 10202 1727204054.02135: done getting variables 10202 1727204054.02220: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.059) 0:00:15.697 ***** 10202 1727204054.02249: entering _queue_task() for managed-node3/dnf 10202 1727204054.02517: worker is 1 (out of 1 available) 10202 1727204054.02534: exiting _queue_task() for managed-node3/dnf 10202 1727204054.02547: done queuing things up, now waiting for results queue to drain 10202 1727204054.02548: waiting for pending results... 
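The teaming guard at `main.yml:25` skipped on its second condition: the log shows `ansible_distribution_major_version | int > 9` evaluating True but `ansible_distribution in __network_rh_distros` evaluating False, so the managed host is on a version 10+ distribution that is not in the role's Red Hat family list. A hedged sketch of that task — conditions from the log, message wording assumed:

```yaml
# Hypothetical sketch of the EL10-or-later teaming guard (main.yml:25).
# Both when-conditions match the evaluations printed in the log; the msg is assumed.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9  # True in this run
    - ansible_distribution in __network_rh_distros  # False in this run -> task skips
```

Note the filter plugins (core, mathstuff, etc.) are loaded just before this evaluation because the `| int` cast in the conditional requires the Jinja2 filter machinery.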
10202 1727204054.02739: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10202 1727204054.02836: in run() - task 127b8e07-fff9-0b04-2570-00000000002b 10202 1727204054.02848: variable 'ansible_search_path' from source: unknown 10202 1727204054.02851: variable 'ansible_search_path' from source: unknown 10202 1727204054.02890: calling self._execute() 10202 1727204054.02964: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204054.02972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204054.02982: variable 'omit' from source: magic vars 10202 1727204054.03282: variable 'ansible_distribution_major_version' from source: facts 10202 1727204054.03292: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204054.03451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204054.05403: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204054.05464: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204054.05494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204054.05522: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204054.05548: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204054.05614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.05643: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.05663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.05694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.05705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.05801: variable 'ansible_distribution' from source: facts 10202 1727204054.05805: variable 'ansible_distribution_major_version' from source: facts 10202 1727204054.05812: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10202 1727204054.05905: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204054.06002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.06021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.06041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.06076: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.06085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.06118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.06137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.06155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.06187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.06198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.06228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.06246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 
1727204054.06264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.06296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.06307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.06438: variable 'network_connections' from source: task vars 10202 1727204054.06670: variable 'controller_profile' from source: play vars 10202 1727204054.06673: variable 'controller_profile' from source: play vars 10202 1727204054.06675: variable 'controller_device' from source: play vars 10202 1727204054.06678: variable 'controller_device' from source: play vars 10202 1727204054.06680: variable 'port1_profile' from source: play vars 10202 1727204054.06682: variable 'port1_profile' from source: play vars 10202 1727204054.06684: variable 'dhcp_interface1' from source: play vars 10202 1727204054.06743: variable 'dhcp_interface1' from source: play vars 10202 1727204054.06796: variable 'controller_profile' from source: play vars 10202 1727204054.06874: variable 'controller_profile' from source: play vars 10202 1727204054.06891: variable 'port2_profile' from source: play vars 10202 1727204054.06957: variable 'port2_profile' from source: play vars 10202 1727204054.06981: variable 'dhcp_interface2' from source: play vars 10202 1727204054.07048: variable 'dhcp_interface2' from source: play vars 10202 1727204054.07061: variable 'controller_profile' from source: play vars 10202 1727204054.07138: variable 'controller_profile' from source: play vars 10202 1727204054.07241: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204054.07457: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204054.07534: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204054.07563: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204054.07614: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204054.07653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204054.07672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204054.07691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.07711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204054.07767: variable '__network_team_connections_defined' from source: role '' defaults 10202 1727204054.07948: variable 'network_connections' from source: task vars 10202 1727204054.07954: variable 'controller_profile' from source: play vars 10202 1727204054.08004: variable 'controller_profile' from source: play vars 10202 1727204054.08011: variable 'controller_device' from source: play vars 10202 1727204054.08058: variable 'controller_device' from source: play vars 10202 1727204054.08068: variable 
'port1_profile' from source: play vars 10202 1727204054.08113: variable 'port1_profile' from source: play vars 10202 1727204054.08119: variable 'dhcp_interface1' from source: play vars 10202 1727204054.08163: variable 'dhcp_interface1' from source: play vars 10202 1727204054.08177: variable 'controller_profile' from source: play vars 10202 1727204054.08222: variable 'controller_profile' from source: play vars 10202 1727204054.08231: variable 'port2_profile' from source: play vars 10202 1727204054.08274: variable 'port2_profile' from source: play vars 10202 1727204054.08283: variable 'dhcp_interface2' from source: play vars 10202 1727204054.08329: variable 'dhcp_interface2' from source: play vars 10202 1727204054.08333: variable 'controller_profile' from source: play vars 10202 1727204054.08378: variable 'controller_profile' from source: play vars 10202 1727204054.08407: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10202 1727204054.08411: when evaluation is False, skipping this task 10202 1727204054.08414: _execute() done 10202 1727204054.08419: dumping result to json 10202 1727204054.08423: done dumping result, returning 10202 1727204054.08495: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-0b04-2570-00000000002b] 10202 1727204054.08499: sending task result for task 127b8e07-fff9-0b04-2570-00000000002b 10202 1727204054.08572: done sending task result for task 127b8e07-fff9-0b04-2570-00000000002b 10202 1727204054.08575: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10202 1727204054.08630: no more pending results, returning what we have 10202 1727204054.08632: 
results queue empty 10202 1727204054.08634: checking for any_errors_fatal 10202 1727204054.08639: done checking for any_errors_fatal 10202 1727204054.08640: checking for max_fail_percentage 10202 1727204054.08641: done checking for max_fail_percentage 10202 1727204054.08642: checking to see if all hosts have failed and the running result is not ok 10202 1727204054.08643: done checking to see if all hosts have failed 10202 1727204054.08644: getting the remaining hosts for this loop 10202 1727204054.08645: done getting the remaining hosts for this loop 10202 1727204054.08649: getting the next task for host managed-node3 10202 1727204054.08655: done getting next task for host managed-node3 10202 1727204054.08659: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10202 1727204054.08662: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204054.08678: getting variables 10202 1727204054.08679: in VariableManager get_vars() 10202 1727204054.08717: Calling all_inventory to load vars for managed-node3 10202 1727204054.08719: Calling groups_inventory to load vars for managed-node3 10202 1727204054.08721: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204054.08733: Calling all_plugins_play to load vars for managed-node3 10202 1727204054.08735: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204054.08738: Calling groups_plugins_play to load vars for managed-node3 10202 1727204054.10035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204054.12178: done with get_vars() 10202 1727204054.12214: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10202 1727204054.12302: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.100) 0:00:15.798 ***** 10202 1727204054.12341: entering _queue_task() for managed-node3/yum 10202 1727204054.12343: Creating lock for yum 10202 1727204054.12710: worker is 1 (out of 1 available) 10202 1727204054.12723: exiting _queue_task() for managed-node3/yum 10202 1727204054.12739: done queuing things up, now waiting for results queue to drain 10202 1727204054.12740: waiting for pending results... 
10202 1727204054.13189: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10202 1727204054.13195: in run() - task 127b8e07-fff9-0b04-2570-00000000002c 10202 1727204054.13218: variable 'ansible_search_path' from source: unknown 10202 1727204054.13225: variable 'ansible_search_path' from source: unknown 10202 1727204054.13271: calling self._execute() 10202 1727204054.13374: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204054.13500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204054.13503: variable 'omit' from source: magic vars 10202 1727204054.13838: variable 'ansible_distribution_major_version' from source: facts 10202 1727204054.13856: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204054.14071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204054.16596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204054.16687: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204054.16738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204054.16786: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204054.16821: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204054.16922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.16963: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.17002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.17056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.17079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.17196: variable 'ansible_distribution_major_version' from source: facts 10202 1727204054.17224: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10202 1727204054.17236: when evaluation is False, skipping this task 10202 1727204054.17370: _execute() done 10202 1727204054.17373: dumping result to json 10202 1727204054.17376: done dumping result, returning 10202 1727204054.17379: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-0b04-2570-00000000002c] 10202 1727204054.17381: sending task result for task 127b8e07-fff9-0b04-2570-00000000002c 10202 1727204054.17463: done sending task result for task 127b8e07-fff9-0b04-2570-00000000002c 10202 1727204054.17469: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10202 1727204054.17526: no more pending results, returning 
what we have 10202 1727204054.17532: results queue empty 10202 1727204054.17533: checking for any_errors_fatal 10202 1727204054.17540: done checking for any_errors_fatal 10202 1727204054.17541: checking for max_fail_percentage 10202 1727204054.17543: done checking for max_fail_percentage 10202 1727204054.17545: checking to see if all hosts have failed and the running result is not ok 10202 1727204054.17546: done checking to see if all hosts have failed 10202 1727204054.17546: getting the remaining hosts for this loop 10202 1727204054.17548: done getting the remaining hosts for this loop 10202 1727204054.17553: getting the next task for host managed-node3 10202 1727204054.17561: done getting next task for host managed-node3 10202 1727204054.17565: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10202 1727204054.17569: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204054.17584: getting variables 10202 1727204054.17585: in VariableManager get_vars() 10202 1727204054.17632: Calling all_inventory to load vars for managed-node3 10202 1727204054.17635: Calling groups_inventory to load vars for managed-node3 10202 1727204054.17637: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204054.17648: Calling all_plugins_play to load vars for managed-node3 10202 1727204054.17650: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204054.17652: Calling groups_plugins_play to load vars for managed-node3 10202 1727204054.19971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204054.22905: done with get_vars() 10202 1727204054.23090: done getting variables 10202 1727204054.23218: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.109) 0:00:15.908 ***** 10202 1727204054.23302: entering _queue_task() for managed-node3/fail 10202 1727204054.23696: worker is 1 (out of 1 available) 10202 1727204054.23709: exiting _queue_task() for managed-node3/fail 10202 1727204054.23836: done queuing things up, now waiting for results queue to drain 10202 1727204054.23838: waiting for pending results... 
10202 1727204054.24174: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10202 1727204054.24214: in run() - task 127b8e07-fff9-0b04-2570-00000000002d 10202 1727204054.24238: variable 'ansible_search_path' from source: unknown 10202 1727204054.24246: variable 'ansible_search_path' from source: unknown 10202 1727204054.24300: calling self._execute() 10202 1727204054.24411: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204054.24426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204054.24447: variable 'omit' from source: magic vars 10202 1727204054.24896: variable 'ansible_distribution_major_version' from source: facts 10202 1727204054.24924: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204054.25140: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204054.25306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204054.32439: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204054.32536: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204054.32617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204054.32668: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204054.32708: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204054.32811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 10202 1727204054.32852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.32886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.32948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.32972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.33043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.33076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.33106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.33164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.33185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.33247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.33345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.33353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.33367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.33388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.33603: variable 'network_connections' from source: task vars 10202 1727204054.33625: variable 'controller_profile' from source: play vars 10202 1727204054.33733: variable 'controller_profile' from source: play vars 10202 1727204054.33749: variable 'controller_device' from source: play vars 10202 1727204054.33842: variable 'controller_device' from source: play vars 10202 1727204054.33859: variable 'port1_profile' from source: play vars 10202 1727204054.33936: variable 'port1_profile' from source: play vars 10202 1727204054.33972: variable 'dhcp_interface1' from source: play vars 10202 1727204054.34026: variable 'dhcp_interface1' from source: play vars 10202 1727204054.34042: variable 'controller_profile' from source: play vars 10202 
1727204054.34124: variable 'controller_profile' from source: play vars 10202 1727204054.34131: variable 'port2_profile' from source: play vars 10202 1727204054.34225: variable 'port2_profile' from source: play vars 10202 1727204054.34235: variable 'dhcp_interface2' from source: play vars 10202 1727204054.34293: variable 'dhcp_interface2' from source: play vars 10202 1727204054.34326: variable 'controller_profile' from source: play vars 10202 1727204054.34383: variable 'controller_profile' from source: play vars 10202 1727204054.34477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204054.34762: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204054.34766: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204054.34791: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204054.34822: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204054.34883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204054.34921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204054.34963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.35013: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 10202 1727204054.35201: variable '__network_team_connections_defined' from source: role '' defaults 10202 1727204054.36068: variable 'network_connections' from source: task vars 10202 1727204054.36074: variable 'controller_profile' from source: play vars 10202 1727204054.36077: variable 'controller_profile' from source: play vars 10202 1727204054.36079: variable 'controller_device' from source: play vars 10202 1727204054.36241: variable 'controller_device' from source: play vars 10202 1727204054.36471: variable 'port1_profile' from source: play vars 10202 1727204054.36474: variable 'port1_profile' from source: play vars 10202 1727204054.36477: variable 'dhcp_interface1' from source: play vars 10202 1727204054.36558: variable 'dhcp_interface1' from source: play vars 10202 1727204054.36772: variable 'controller_profile' from source: play vars 10202 1727204054.36798: variable 'controller_profile' from source: play vars 10202 1727204054.36812: variable 'port2_profile' from source: play vars 10202 1727204054.36955: variable 'port2_profile' from source: play vars 10202 1727204054.36971: variable 'dhcp_interface2' from source: play vars 10202 1727204054.37111: variable 'dhcp_interface2' from source: play vars 10202 1727204054.37167: variable 'controller_profile' from source: play vars 10202 1727204054.37230: variable 'controller_profile' from source: play vars 10202 1727204054.37483: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10202 1727204054.37486: when evaluation is False, skipping this task 10202 1727204054.37489: _execute() done 10202 1727204054.37491: dumping result to json 10202 1727204054.37493: done dumping result, returning 10202 1727204054.37496: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-0b04-2570-00000000002d] 10202 1727204054.37498: sending 
task result for task 127b8e07-fff9-0b04-2570-00000000002d 10202 1727204054.37714: done sending task result for task 127b8e07-fff9-0b04-2570-00000000002d 10202 1727204054.37718: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10202 1727204054.37778: no more pending results, returning what we have 10202 1727204054.37782: results queue empty 10202 1727204054.37783: checking for any_errors_fatal 10202 1727204054.37789: done checking for any_errors_fatal 10202 1727204054.37790: checking for max_fail_percentage 10202 1727204054.37792: done checking for max_fail_percentage 10202 1727204054.37793: checking to see if all hosts have failed and the running result is not ok 10202 1727204054.37795: done checking to see if all hosts have failed 10202 1727204054.37795: getting the remaining hosts for this loop 10202 1727204054.37797: done getting the remaining hosts for this loop 10202 1727204054.37802: getting the next task for host managed-node3 10202 1727204054.37809: done getting next task for host managed-node3 10202 1727204054.37814: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10202 1727204054.37817: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204054.37836: getting variables 10202 1727204054.37838: in VariableManager get_vars() 10202 1727204054.37888: Calling all_inventory to load vars for managed-node3 10202 1727204054.37892: Calling groups_inventory to load vars for managed-node3 10202 1727204054.37894: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204054.37907: Calling all_plugins_play to load vars for managed-node3 10202 1727204054.37910: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204054.37913: Calling groups_plugins_play to load vars for managed-node3 10202 1727204054.40108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204054.42453: done with get_vars() 10202 1727204054.42497: done getting variables 10202 1727204054.42569: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.193) 0:00:16.101 ***** 10202 1727204054.42611: entering _queue_task() for managed-node3/package 10202 1727204054.42997: worker is 1 (out of 1 available) 10202 1727204054.43125: exiting _queue_task() for managed-node3/package 10202 1727204054.43139: done queuing things up, now waiting for results queue to drain 10202 1727204054.43140: waiting for pending results... 
10202 1727204054.43355: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 10202 1727204054.43568: in run() - task 127b8e07-fff9-0b04-2570-00000000002e 10202 1727204054.43573: variable 'ansible_search_path' from source: unknown 10202 1727204054.43576: variable 'ansible_search_path' from source: unknown 10202 1727204054.43586: calling self._execute() 10202 1727204054.43693: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204054.43706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204054.43721: variable 'omit' from source: magic vars 10202 1727204054.44161: variable 'ansible_distribution_major_version' from source: facts 10202 1727204054.44221: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204054.44411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204054.44718: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204054.44784: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204054.44824: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204054.44863: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204054.45071: variable 'network_packages' from source: role '' defaults 10202 1727204054.45130: variable '__network_provider_setup' from source: role '' defaults 10202 1727204054.45148: variable '__network_service_name_default_nm' from source: role '' defaults 10202 1727204054.45235: variable '__network_service_name_default_nm' from source: role '' defaults 10202 1727204054.45249: variable '__network_packages_default_nm' from source: role '' defaults 10202 1727204054.45322: variable 
'__network_packages_default_nm' from source: role '' defaults 10202 1727204054.45543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204054.55862: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204054.55943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204054.56215: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204054.56262: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204054.56382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204054.56482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.56664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.56671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.56688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.56699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 
1727204054.56781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.56855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.56893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.56926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.56951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.57331: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10202 1727204054.58070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.58076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.58079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.58082: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204054.58084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204054.58174: variable 'ansible_python' from source: facts
10202 1727204054.58202: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
10202 1727204054.58473: variable '__network_wpa_supplicant_required' from source: role '' defaults
10202 1727204054.58667: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
10202 1727204054.58857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204054.58910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204054.58941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204054.58987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204054.59006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204054.59061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204054.59087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204054.59117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204054.59162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204054.59179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204054.59353: variable 'network_connections' from source: task vars
10202 1727204054.59359: variable 'controller_profile' from source: play vars
10202 1727204054.59481: variable 'controller_profile' from source: play vars
10202 1727204054.59493: variable 'controller_device' from source: play vars
10202 1727204054.59602: variable 'controller_device' from source: play vars
10202 1727204054.59615: variable 'port1_profile' from source: play vars
10202 1727204054.59727: variable 'port1_profile' from source: play vars
10202 1727204054.59739: variable 'dhcp_interface1' from source: play vars
10202 1727204054.59846: variable 'dhcp_interface1' from source: play vars
10202 1727204054.59882: variable 'controller_profile' from source: play vars
10202 1727204054.59967: variable 'controller_profile' from source: play vars
10202 1727204054.59978: variable 'port2_profile' from source: play vars
10202 1727204054.60098: variable 'port2_profile' from source: play vars
10202 1727204054.60102: variable 'dhcp_interface2' from source: play vars
10202 1727204054.60208: variable 'dhcp_interface2' from source: play vars
10202 1727204054.60270: variable 'controller_profile' from source: play vars
10202 1727204054.60333: variable 'controller_profile' from source: play vars
10202 1727204054.60457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
10202 1727204054.60536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
10202 1727204054.60574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204054.60617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
10202 1727204054.60660: variable '__network_wireless_connections_defined' from source: role '' defaults
10202 1727204054.61074: variable 'network_connections' from source: task vars
10202 1727204054.61078: variable 'controller_profile' from source: play vars
10202 1727204054.61272: variable 'controller_profile' from source: play vars
10202 1727204054.61276: variable 'controller_device' from source: play vars
10202 1727204054.61281: variable 'controller_device' from source: play vars
10202 1727204054.61298: variable 'port1_profile' from source: play vars
10202 1727204054.61406: variable 'port1_profile' from source: play vars
10202 1727204054.61508: variable 'dhcp_interface1' from source: play vars
10202 1727204054.61536: variable 'dhcp_interface1' from source: play vars
10202 1727204054.61545: variable 'controller_profile' from source: play vars
10202 1727204054.61654: variable 'controller_profile' from source: play vars
10202 1727204054.61664: variable 'port2_profile' from source: play vars
10202 1727204054.61787: variable 'port2_profile' from source: play vars
10202 1727204054.61796: variable 'dhcp_interface2' from source: play vars
10202 1727204054.61905: variable 'dhcp_interface2' from source: play vars
10202 1727204054.61943: variable 'controller_profile' from source: play vars
10202 1727204054.62171: variable 'controller_profile' from source: play vars
10202 1727204054.62174: variable '__network_packages_default_wireless' from source: role '' defaults
10202 1727204054.62177: variable '__network_wireless_connections_defined' from source: role '' defaults
10202 1727204054.62560: variable 'network_connections' from source: task vars
10202 1727204054.62564: variable 'controller_profile' from source: play vars
10202 1727204054.62770: variable 'controller_profile' from source: play vars
10202 1727204054.62859: variable 'controller_device' from source: play vars
10202 1727204054.62929: variable 'controller_device' from source: play vars
10202 1727204054.62942: variable 'port1_profile' from source: play vars
10202 1727204054.63104: variable 'port1_profile' from source: play vars
10202 1727204054.63111: variable 'dhcp_interface1' from source: play vars
10202 1727204054.63202: variable 'dhcp_interface1' from source: play vars
10202 1727204054.63209: variable 'controller_profile' from source: play vars
10202 1727204054.63283: variable 'controller_profile' from source: play vars
10202 1727204054.63477: variable 'port2_profile' from source: play vars
10202 1727204054.63480: variable 'port2_profile' from source: play vars
10202 1727204054.63483: variable 'dhcp_interface2' from source: play vars
10202 1727204054.63487: variable 'dhcp_interface2' from source: play vars
10202 1727204054.63489: variable 'controller_profile' from source: play vars
10202 1727204054.63509: variable 'controller_profile' from source: play vars
10202 1727204054.63542: variable '__network_packages_default_team' from source: role '' defaults
10202 1727204054.63630: variable '__network_team_connections_defined' from source: role '' defaults
10202 1727204054.63975: variable 'network_connections' from source: task vars
10202 1727204054.63979: variable 'controller_profile' from source: play vars
10202 1727204054.64175: variable 'controller_profile' from source: play vars
10202 1727204054.64179: variable 'controller_device' from source: play vars
10202 1727204054.64181: variable 'controller_device' from source: play vars
10202 1727204054.64183: variable 'port1_profile' from source: play vars
10202 1727204054.64224: variable 'port1_profile' from source: play vars
10202 1727204054.64240: variable 'dhcp_interface1' from source: play vars
10202 1727204054.64316: variable 'dhcp_interface1' from source: play vars
10202 1727204054.64330: variable 'controller_profile' from source: play vars
10202 1727204054.64436: variable 'controller_profile' from source: play vars
10202 1727204054.64494: variable 'port2_profile' from source: play vars
10202 1727204054.64568: variable 'port2_profile' from source: play vars
10202 1727204054.64586: variable 'dhcp_interface2' from source: play vars
10202 1727204054.64710: variable 'dhcp_interface2' from source: play vars
10202 1727204054.64721: variable 'controller_profile' from source: play vars
10202 1727204054.64797: variable 'controller_profile' from source: play vars
10202 1727204054.64887: variable '__network_service_name_default_initscripts' from source: role '' defaults
10202 1727204054.64959: variable '__network_service_name_default_initscripts' from source: role '' defaults
10202 1727204054.64974: variable '__network_packages_default_initscripts' from source: role '' defaults
10202 1727204054.65044: variable '__network_packages_default_initscripts' from source: role '' defaults
10202 1727204054.65280: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
10202 1727204054.65861: variable 'network_connections' from source: task vars
10202 1727204054.65864: variable 'controller_profile' from source: play vars
10202 1727204054.65917: variable 'controller_profile' from source: play vars
10202 1727204054.65925: variable 'controller_device' from source: play vars
10202 1727204054.66014: variable 'controller_device' from source: play vars
10202 1727204054.66017: variable 'port1_profile' from source: play vars
10202 1727204054.66058: variable 'port1_profile' from source: play vars
10202 1727204054.66067: variable 'dhcp_interface1' from source: play vars
10202 1727204054.66134: variable 'dhcp_interface1' from source: play vars
10202 1727204054.66221: variable 'controller_profile' from source: play vars
10202 1727204054.66224: variable 'controller_profile' from source: play vars
10202 1727204054.66227: variable 'port2_profile' from source: play vars
10202 1727204054.66274: variable 'port2_profile' from source: play vars
10202 1727204054.66282: variable 'dhcp_interface2' from source: play vars
10202 1727204054.66347: variable 'dhcp_interface2' from source: play vars
10202 1727204054.66354: variable 'controller_profile' from source: play vars
10202 1727204054.66417: variable 'controller_profile' from source: play vars
10202 1727204054.66425: variable 'ansible_distribution' from source: facts
10202 1727204054.66438: variable '__network_rh_distros' from source: role '' defaults
10202 1727204054.66441: variable 'ansible_distribution_major_version' from source: facts
10202 1727204054.66469: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
10202 1727204054.66917: variable 'ansible_distribution' from source: facts
10202 1727204054.66920: variable '__network_rh_distros' from source: role '' defaults
10202 1727204054.66922: variable 'ansible_distribution_major_version' from source: facts
10202 1727204054.66924: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
10202 1727204054.66970: variable 'ansible_distribution' from source: facts
10202 1727204054.66974: variable '__network_rh_distros' from source: role '' defaults
10202 1727204054.66983: variable 'ansible_distribution_major_version' from source: facts
10202 1727204054.67172: variable 'network_provider' from source: set_fact
10202 1727204054.67175: variable 'ansible_facts' from source: unknown
10202 1727204054.67907: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
10202 1727204054.67911: when evaluation is False, skipping this task
10202 1727204054.67914: _execute() done
10202 1727204054.67918: dumping result to json
10202 1727204054.67920: done dumping result, returning
10202 1727204054.67923: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-0b04-2570-00000000002e]
10202 1727204054.67932: sending task result for task 127b8e07-fff9-0b04-2570-00000000002e
10202 1727204054.68168: done sending task result for task 127b8e07-fff9-0b04-2570-00000000002e
10202 1727204054.68172: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
10202 1727204054.68312: no more pending results, returning what we have
10202 1727204054.68315: results queue empty
10202 1727204054.68316: checking for any_errors_fatal
10202 1727204054.68322: done checking for any_errors_fatal
10202 1727204054.68323: checking for max_fail_percentage
10202 1727204054.68324: done checking for max_fail_percentage
10202 1727204054.68325: checking to see if all hosts have failed and the running result is not ok
10202 1727204054.68326: done checking to see if all hosts have failed
10202 1727204054.68329: getting the remaining hosts for this loop
10202 1727204054.68330: done getting the remaining hosts for this loop
10202 1727204054.68334: getting the next task for host managed-node3
10202 1727204054.68340: done getting next task for host managed-node3
10202 1727204054.68343: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
10202 1727204054.68346: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204054.68361: getting variables
10202 1727204054.68362: in VariableManager get_vars()
10202 1727204054.68403: Calling all_inventory to load vars for managed-node3
10202 1727204054.68406: Calling groups_inventory to load vars for managed-node3
10202 1727204054.68409: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204054.68420: Calling all_plugins_play to load vars for managed-node3
10202 1727204054.68423: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204054.68426: Calling groups_plugins_play to load vars for managed-node3
10202 1727204054.75291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204054.77772: done with get_vars()
10202 1727204054.77808: done getting variables
10202 1727204054.77878: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024  14:54:14 -0400 (0:00:00.352)       0:00:16.454 *****
10202 1727204054.77911: entering _queue_task() for managed-node3/package
10202 1727204054.78295: worker is 1 (out of 1 available)
10202 1727204054.78310: exiting _queue_task() for managed-node3/package
10202 1727204054.78324: done queuing things up, now waiting for results queue to drain
10202 1727204054.78326: waiting for pending results...
10202 1727204054.78684: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
10202 1727204054.78690: in run() - task 127b8e07-fff9-0b04-2570-00000000002f
10202 1727204054.78694: variable 'ansible_search_path' from source: unknown
10202 1727204054.78697: variable 'ansible_search_path' from source: unknown
10202 1727204054.78741: calling self._execute()
10202 1727204054.78846: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204054.78859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204054.78874: variable 'omit' from source: magic vars
10202 1727204054.79280: variable 'ansible_distribution_major_version' from source: facts
10202 1727204054.79298: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204054.79436: variable 'network_state' from source: role '' defaults
10202 1727204054.79453: Evaluated conditional (network_state != {}): False
10202 1727204054.79462: when evaluation is False, skipping this task
10202 1727204054.79473: _execute() done
10202 1727204054.79481: dumping result to json
10202 1727204054.79492: done dumping result, returning
10202 1727204054.79672: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-0b04-2570-00000000002f]
10202 1727204054.79675: sending task result for task 127b8e07-fff9-0b04-2570-00000000002f
10202 1727204054.79757: done sending task result for task 127b8e07-fff9-0b04-2570-00000000002f
10202 1727204054.79760: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
10202 1727204054.79813: no more pending results, returning what we have
10202 1727204054.79817: results queue empty
10202 1727204054.79818: checking for any_errors_fatal
10202 1727204054.79828: done checking for any_errors_fatal
10202 1727204054.79829: checking for max_fail_percentage
10202 1727204054.79831: done checking for max_fail_percentage
10202 1727204054.79832: checking to see if all hosts have failed and the running result is not ok
10202 1727204054.79834: done checking to see if all hosts have failed
10202 1727204054.79835: getting the remaining hosts for this loop
10202 1727204054.79837: done getting the remaining hosts for this loop
10202 1727204054.79842: getting the next task for host managed-node3
10202 1727204054.79849: done getting next task for host managed-node3
10202 1727204054.79853: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
10202 1727204054.79856: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204054.79877: getting variables
10202 1727204054.79879: in VariableManager get_vars()
10202 1727204054.79928: Calling all_inventory to load vars for managed-node3
10202 1727204054.79931: Calling groups_inventory to load vars for managed-node3
10202 1727204054.79934: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204054.79948: Calling all_plugins_play to load vars for managed-node3
10202 1727204054.79951: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204054.79954: Calling groups_plugins_play to load vars for managed-node3
10202 1727204054.81929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204054.84053: done with get_vars()
10202 1727204054.84094: done getting variables
10202 1727204054.84160: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024  14:54:14 -0400 (0:00:00.062)       0:00:16.517 *****
10202 1727204054.84201: entering _queue_task() for managed-node3/package
10202 1727204054.84797: worker is 1 (out of 1 available)
10202 1727204054.84806: exiting _queue_task() for managed-node3/package
10202 1727204054.84819: done queuing things up, now waiting for results queue to drain
10202 1727204054.84820: waiting for pending results...
10202 1727204054.84950: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
10202 1727204054.85048: in run() - task 127b8e07-fff9-0b04-2570-000000000030
10202 1727204054.85072: variable 'ansible_search_path' from source: unknown
10202 1727204054.85080: variable 'ansible_search_path' from source: unknown
10202 1727204054.85124: calling self._execute()
10202 1727204054.85229: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204054.85245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204054.85370: variable 'omit' from source: magic vars
10202 1727204054.85691: variable 'ansible_distribution_major_version' from source: facts
10202 1727204054.85714: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204054.85856: variable 'network_state' from source: role '' defaults
10202 1727204054.85919: Evaluated conditional (network_state != {}): False
10202 1727204054.85922: when evaluation is False, skipping this task
10202 1727204054.85925: _execute() done
10202 1727204054.85927: dumping result to json
10202 1727204054.85929: done dumping result, returning
10202 1727204054.85932: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-0b04-2570-000000000030]
10202 1727204054.85935: sending task result for task 127b8e07-fff9-0b04-2570-000000000030
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
10202 1727204054.86248: no more pending results, returning what we have
10202 1727204054.86252: results queue empty
10202 1727204054.86253: checking for any_errors_fatal
10202 1727204054.86263: done checking for any_errors_fatal
10202 1727204054.86263: checking for max_fail_percentage
10202 1727204054.86267: done checking for max_fail_percentage
10202 1727204054.86268: checking to see if all hosts have failed and the running result is not ok
10202 1727204054.86269: done checking to see if all hosts have failed
10202 1727204054.86271: getting the remaining hosts for this loop
10202 1727204054.86273: done getting the remaining hosts for this loop
10202 1727204054.86278: getting the next task for host managed-node3
10202 1727204054.86285: done getting next task for host managed-node3
10202 1727204054.86290: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
10202 1727204054.86294: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204054.86312: getting variables
10202 1727204054.86314: in VariableManager get_vars()
10202 1727204054.86361: Calling all_inventory to load vars for managed-node3
10202 1727204054.86364: Calling groups_inventory to load vars for managed-node3
10202 1727204054.86552: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204054.86561: done sending task result for task 127b8e07-fff9-0b04-2570-000000000030
10202 1727204054.86567: WORKER PROCESS EXITING
10202 1727204054.86578: Calling all_plugins_play to load vars for managed-node3
10202 1727204054.86582: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204054.86585: Calling groups_plugins_play to load vars for managed-node3
10202 1727204054.88200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204054.90438: done with get_vars()
10202 1727204054.90477: done getting variables
10202 1727204054.90587: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024  14:54:14 -0400 (0:00:00.064)       0:00:16.581 *****
10202 1727204054.90620: entering _queue_task() for managed-node3/service
10202 1727204054.90622: Creating lock for service
10202 1727204054.90985: worker is 1 (out of 1 available)
10202 1727204054.90998: exiting _queue_task() for managed-node3/service
10202 1727204054.91012: done queuing things up, now waiting for results queue to drain
10202 1727204054.91013: waiting for pending results...
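Both package tasks above were skipped for the same reason: `network_state` still held its role default of `{}`, so the `network_state != {}` conditional evaluated to False. As a rough sketch only (hypothetical, not copied from the role's actual tasks/main.yml), a task gated this way looks like:

```yaml
# Hypothetical sketch of a task guarded by network_state.
# Task body and package list are illustrative assumptions.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'   # evaluated True in this run
    - network_state != {}                         # role default is {}, so this is False -> task skipped
```

With `when:` given as a list, every entry must be true; the log shows the first condition passing and the second failing, which produces the `skipping:` result.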
10202 1727204054.91310: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10202 1727204054.91452: in run() - task 127b8e07-fff9-0b04-2570-000000000031 10202 1727204054.91480: variable 'ansible_search_path' from source: unknown 10202 1727204054.91494: variable 'ansible_search_path' from source: unknown 10202 1727204054.91539: calling self._execute() 10202 1727204054.91649: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204054.91663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204054.91680: variable 'omit' from source: magic vars 10202 1727204054.92124: variable 'ansible_distribution_major_version' from source: facts 10202 1727204054.92148: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204054.92287: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204054.92524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204054.95596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204054.95621: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204054.95690: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204054.95739: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204054.95777: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204054.95880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 10202 1727204054.95923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.95956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.96007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.96035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.96136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.96140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.96160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.96208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.96229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.96289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204054.96319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204054.96462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.96465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204054.96470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204054.96630: variable 'network_connections' from source: task vars 10202 1727204054.96651: variable 'controller_profile' from source: play vars 10202 1727204054.96739: variable 'controller_profile' from source: play vars 10202 1727204054.96755: variable 'controller_device' from source: play vars 10202 1727204054.96834: variable 'controller_device' from source: play vars 10202 1727204054.96850: variable 'port1_profile' from source: play vars 10202 1727204054.96923: variable 'port1_profile' from source: play vars 10202 1727204054.96935: variable 'dhcp_interface1' from source: play vars 10202 1727204054.97006: variable 'dhcp_interface1' from source: play vars 10202 1727204054.97022: variable 'controller_profile' from source: play vars 10202 
1727204054.97088: variable 'controller_profile' from source: play vars 10202 1727204054.97100: variable 'port2_profile' from source: play vars 10202 1727204054.97160: variable 'port2_profile' from source: play vars 10202 1727204054.97173: variable 'dhcp_interface2' from source: play vars 10202 1727204054.97234: variable 'dhcp_interface2' from source: play vars 10202 1727204054.97245: variable 'controller_profile' from source: play vars 10202 1727204054.97308: variable 'controller_profile' from source: play vars 10202 1727204054.97396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204054.97665: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204054.97670: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204054.97685: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204054.97719: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204054.97779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204054.97809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204054.97839: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204054.97872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 10202 1727204054.97958: variable '__network_team_connections_defined' from source: role '' defaults 10202 1727204054.98243: variable 'network_connections' from source: task vars 10202 1727204054.98254: variable 'controller_profile' from source: play vars 10202 1727204054.98330: variable 'controller_profile' from source: play vars 10202 1727204054.98343: variable 'controller_device' from source: play vars 10202 1727204054.98412: variable 'controller_device' from source: play vars 10202 1727204054.98538: variable 'port1_profile' from source: play vars 10202 1727204054.98541: variable 'port1_profile' from source: play vars 10202 1727204054.98544: variable 'dhcp_interface1' from source: play vars 10202 1727204054.98574: variable 'dhcp_interface1' from source: play vars 10202 1727204054.98585: variable 'controller_profile' from source: play vars 10202 1727204054.98654: variable 'controller_profile' from source: play vars 10202 1727204054.98668: variable 'port2_profile' from source: play vars 10202 1727204054.98732: variable 'port2_profile' from source: play vars 10202 1727204054.98745: variable 'dhcp_interface2' from source: play vars 10202 1727204054.98815: variable 'dhcp_interface2' from source: play vars 10202 1727204054.98827: variable 'controller_profile' from source: play vars 10202 1727204054.98898: variable 'controller_profile' from source: play vars 10202 1727204054.98940: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10202 1727204054.98949: when evaluation is False, skipping this task 10202 1727204054.98955: _execute() done 10202 1727204054.98962: dumping result to json 10202 1727204054.98974: done dumping result, returning 10202 1727204054.98986: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-0b04-2570-000000000031] 10202 1727204054.98995: sending task result for task 
127b8e07-fff9-0b04-2570-000000000031 10202 1727204054.99304: done sending task result for task 127b8e07-fff9-0b04-2570-000000000031 10202 1727204054.99307: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10202 1727204054.99355: no more pending results, returning what we have 10202 1727204054.99358: results queue empty 10202 1727204054.99359: checking for any_errors_fatal 10202 1727204054.99368: done checking for any_errors_fatal 10202 1727204054.99369: checking for max_fail_percentage 10202 1727204054.99371: done checking for max_fail_percentage 10202 1727204054.99372: checking to see if all hosts have failed and the running result is not ok 10202 1727204054.99373: done checking to see if all hosts have failed 10202 1727204054.99374: getting the remaining hosts for this loop 10202 1727204054.99376: done getting the remaining hosts for this loop 10202 1727204054.99380: getting the next task for host managed-node3 10202 1727204054.99387: done getting next task for host managed-node3 10202 1727204054.99391: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10202 1727204054.99394: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204054.99410: getting variables 10202 1727204054.99411: in VariableManager get_vars() 10202 1727204054.99455: Calling all_inventory to load vars for managed-node3 10202 1727204054.99458: Calling groups_inventory to load vars for managed-node3 10202 1727204054.99461: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204054.99646: Calling all_plugins_play to load vars for managed-node3 10202 1727204054.99650: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204054.99654: Calling groups_plugins_play to load vars for managed-node3 10202 1727204055.01417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204055.03517: done with get_vars() 10202 1727204055.03556: done getting variables 10202 1727204055.03627: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.130) 0:00:16.711 ***** 10202 1727204055.03660: entering _queue_task() for managed-node3/service 10202 1727204055.04032: worker is 1 (out of 1 available) 10202 1727204055.04047: exiting _queue_task() for managed-node3/service 10202 1727204055.04062: done queuing things up, now waiting for results queue to drain 10202 1727204055.04063: waiting for pending results... 
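For context, the task being queued here (task path `roles/network/tasks/main.yml:122`) is a `service` task whose two conditionals the log evaluates to True a few lines below. A hypothetical sketch of what such a task looks like follows; the actual task in `fedora.linux_system_roles.network` may differ in detail, and the structure below is inferred only from the conditionals and variables visible in this log:

```yaml
# Hypothetical reconstruction -- not the role's verbatim source.
# The two `when` clauses mirror the conditionals the log evaluates:
#   (ansible_distribution_major_version != '6'): True
#   (network_provider == "nm" or network_state != {}): True
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm" or network_state != {}
```

Because the action resolves to `service` and the target system is systemd-based, Ansible transparently transfers and runs `AnsiballZ_systemd.py`, which is what the SSH chunks later in the log show.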
10202 1727204055.04367: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10202 1727204055.04519: in run() - task 127b8e07-fff9-0b04-2570-000000000032 10202 1727204055.04542: variable 'ansible_search_path' from source: unknown 10202 1727204055.04549: variable 'ansible_search_path' from source: unknown 10202 1727204055.04596: calling self._execute() 10202 1727204055.04711: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204055.04727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204055.04742: variable 'omit' from source: magic vars 10202 1727204055.05163: variable 'ansible_distribution_major_version' from source: facts 10202 1727204055.05184: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204055.05376: variable 'network_provider' from source: set_fact 10202 1727204055.05387: variable 'network_state' from source: role '' defaults 10202 1727204055.05470: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10202 1727204055.05474: variable 'omit' from source: magic vars 10202 1727204055.05484: variable 'omit' from source: magic vars 10202 1727204055.05519: variable 'network_service_name' from source: role '' defaults 10202 1727204055.05600: variable 'network_service_name' from source: role '' defaults 10202 1727204055.05723: variable '__network_provider_setup' from source: role '' defaults 10202 1727204055.05734: variable '__network_service_name_default_nm' from source: role '' defaults 10202 1727204055.05809: variable '__network_service_name_default_nm' from source: role '' defaults 10202 1727204055.05822: variable '__network_packages_default_nm' from source: role '' defaults 10202 1727204055.05891: variable '__network_packages_default_nm' from source: role '' defaults 10202 1727204055.06156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 10202 1727204055.08513: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204055.08615: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204055.08871: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204055.08875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204055.08877: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204055.08880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204055.08883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204055.08908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204055.08957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204055.08981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204055.09043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 10202 1727204055.09077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204055.09114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204055.09160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204055.09184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204055.09458: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10202 1727204055.09610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204055.09643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204055.09683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204055.09731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204055.09752: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204055.09861: variable 'ansible_python' from source: facts 10202 1727204055.09901: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10202 1727204055.10007: variable '__network_wpa_supplicant_required' from source: role '' defaults 10202 1727204055.10101: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10202 1727204055.10228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204055.10256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204055.10288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204055.10416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204055.10420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204055.10422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204055.10450: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204055.10481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204055.10526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204055.10545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204055.10702: variable 'network_connections' from source: task vars 10202 1727204055.10715: variable 'controller_profile' from source: play vars 10202 1727204055.10804: variable 'controller_profile' from source: play vars 10202 1727204055.10823: variable 'controller_device' from source: play vars 10202 1727204055.10907: variable 'controller_device' from source: play vars 10202 1727204055.10926: variable 'port1_profile' from source: play vars 10202 1727204055.11009: variable 'port1_profile' from source: play vars 10202 1727204055.11066: variable 'dhcp_interface1' from source: play vars 10202 1727204055.11108: variable 'dhcp_interface1' from source: play vars 10202 1727204055.11123: variable 'controller_profile' from source: play vars 10202 1727204055.11203: variable 'controller_profile' from source: play vars 10202 1727204055.11220: variable 'port2_profile' from source: play vars 10202 1727204055.11302: variable 'port2_profile' from source: play vars 10202 1727204055.11318: variable 'dhcp_interface2' from source: play vars 10202 1727204055.11400: variable 'dhcp_interface2' from source: play vars 10202 
1727204055.11499: variable 'controller_profile' from source: play vars 10202 1727204055.11502: variable 'controller_profile' from source: play vars 10202 1727204055.11618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204055.11853: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204055.11912: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204055.11964: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204055.12015: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204055.12094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204055.12132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204055.12179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204055.12222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204055.12287: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204055.12618: variable 'network_connections' from source: task vars 10202 1727204055.12631: variable 'controller_profile' from source: play vars 10202 1727204055.12772: variable 'controller_profile' from source: play vars 10202 
1727204055.12776: variable 'controller_device' from source: play vars 10202 1727204055.12824: variable 'controller_device' from source: play vars 10202 1727204055.12842: variable 'port1_profile' from source: play vars 10202 1727204055.12927: variable 'port1_profile' from source: play vars 10202 1727204055.12943: variable 'dhcp_interface1' from source: play vars 10202 1727204055.13026: variable 'dhcp_interface1' from source: play vars 10202 1727204055.13042: variable 'controller_profile' from source: play vars 10202 1727204055.13122: variable 'controller_profile' from source: play vars 10202 1727204055.13142: variable 'port2_profile' from source: play vars 10202 1727204055.13238: variable 'port2_profile' from source: play vars 10202 1727204055.13241: variable 'dhcp_interface2' from source: play vars 10202 1727204055.13316: variable 'dhcp_interface2' from source: play vars 10202 1727204055.13346: variable 'controller_profile' from source: play vars 10202 1727204055.13414: variable 'controller_profile' from source: play vars 10202 1727204055.13563: variable '__network_packages_default_wireless' from source: role '' defaults 10202 1727204055.13570: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204055.13895: variable 'network_connections' from source: task vars 10202 1727204055.13906: variable 'controller_profile' from source: play vars 10202 1727204055.13980: variable 'controller_profile' from source: play vars 10202 1727204055.13995: variable 'controller_device' from source: play vars 10202 1727204055.14072: variable 'controller_device' from source: play vars 10202 1727204055.14086: variable 'port1_profile' from source: play vars 10202 1727204055.14162: variable 'port1_profile' from source: play vars 10202 1727204055.14217: variable 'dhcp_interface1' from source: play vars 10202 1727204055.14253: variable 'dhcp_interface1' from source: play vars 10202 1727204055.14265: variable 'controller_profile' from source: play vars 
10202 1727204055.14344: variable 'controller_profile' from source: play vars 10202 1727204055.14357: variable 'port2_profile' from source: play vars 10202 1727204055.14437: variable 'port2_profile' from source: play vars 10202 1727204055.14450: variable 'dhcp_interface2' from source: play vars 10202 1727204055.14527: variable 'dhcp_interface2' from source: play vars 10202 1727204055.14652: variable 'controller_profile' from source: play vars 10202 1727204055.14656: variable 'controller_profile' from source: play vars 10202 1727204055.14658: variable '__network_packages_default_team' from source: role '' defaults 10202 1727204055.14737: variable '__network_team_connections_defined' from source: role '' defaults 10202 1727204055.15252: variable 'network_connections' from source: task vars 10202 1727204055.15267: variable 'controller_profile' from source: play vars 10202 1727204055.15349: variable 'controller_profile' from source: play vars 10202 1727204055.15363: variable 'controller_device' from source: play vars 10202 1727204055.15445: variable 'controller_device' from source: play vars 10202 1727204055.15461: variable 'port1_profile' from source: play vars 10202 1727204055.15544: variable 'port1_profile' from source: play vars 10202 1727204055.15559: variable 'dhcp_interface1' from source: play vars 10202 1727204055.15642: variable 'dhcp_interface1' from source: play vars 10202 1727204055.15654: variable 'controller_profile' from source: play vars 10202 1727204055.15730: variable 'controller_profile' from source: play vars 10202 1727204055.15748: variable 'port2_profile' from source: play vars 10202 1727204055.15824: variable 'port2_profile' from source: play vars 10202 1727204055.15837: variable 'dhcp_interface2' from source: play vars 10202 1727204055.15919: variable 'dhcp_interface2' from source: play vars 10202 1727204055.15932: variable 'controller_profile' from source: play vars 10202 1727204055.16068: variable 'controller_profile' from source: play vars 
10202 1727204055.16096: variable '__network_service_name_default_initscripts' from source: role '' defaults 10202 1727204055.16171: variable '__network_service_name_default_initscripts' from source: role '' defaults 10202 1727204055.16187: variable '__network_packages_default_initscripts' from source: role '' defaults 10202 1727204055.16251: variable '__network_packages_default_initscripts' from source: role '' defaults 10202 1727204055.16507: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10202 1727204055.17272: variable 'network_connections' from source: task vars 10202 1727204055.17275: variable 'controller_profile' from source: play vars 10202 1727204055.17278: variable 'controller_profile' from source: play vars 10202 1727204055.17280: variable 'controller_device' from source: play vars 10202 1727204055.17282: variable 'controller_device' from source: play vars 10202 1727204055.17284: variable 'port1_profile' from source: play vars 10202 1727204055.17302: variable 'port1_profile' from source: play vars 10202 1727204055.17314: variable 'dhcp_interface1' from source: play vars 10202 1727204055.17380: variable 'dhcp_interface1' from source: play vars 10202 1727204055.17394: variable 'controller_profile' from source: play vars 10202 1727204055.17461: variable 'controller_profile' from source: play vars 10202 1727204055.17509: variable 'port2_profile' from source: play vars 10202 1727204055.17550: variable 'port2_profile' from source: play vars 10202 1727204055.17563: variable 'dhcp_interface2' from source: play vars 10202 1727204055.17634: variable 'dhcp_interface2' from source: play vars 10202 1727204055.17646: variable 'controller_profile' from source: play vars 10202 1727204055.17711: variable 'controller_profile' from source: play vars 10202 1727204055.17835: variable 'ansible_distribution' from source: facts 10202 1727204055.17839: variable '__network_rh_distros' from source: role '' defaults 10202 1727204055.17841: 
variable 'ansible_distribution_major_version' from source: facts 10202 1727204055.17844: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10202 1727204055.17978: variable 'ansible_distribution' from source: facts 10202 1727204055.17988: variable '__network_rh_distros' from source: role '' defaults 10202 1727204055.17999: variable 'ansible_distribution_major_version' from source: facts 10202 1727204055.18012: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10202 1727204055.18209: variable 'ansible_distribution' from source: facts 10202 1727204055.18218: variable '__network_rh_distros' from source: role '' defaults 10202 1727204055.18228: variable 'ansible_distribution_major_version' from source: facts 10202 1727204055.18277: variable 'network_provider' from source: set_fact 10202 1727204055.18305: variable 'omit' from source: magic vars 10202 1727204055.18340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204055.18376: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204055.18404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204055.18428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204055.18444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204055.18483: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204055.18496: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204055.18505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204055.18622: Set connection var ansible_shell_type to sh 10202 
1727204055.18709: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204055.18712: Set connection var ansible_connection to ssh 10202 1727204055.18715: Set connection var ansible_shell_executable to /bin/sh 10202 1727204055.18717: Set connection var ansible_pipelining to False 10202 1727204055.18719: Set connection var ansible_timeout to 10 10202 1727204055.18721: variable 'ansible_shell_executable' from source: unknown 10202 1727204055.18723: variable 'ansible_connection' from source: unknown 10202 1727204055.18725: variable 'ansible_module_compression' from source: unknown 10202 1727204055.18727: variable 'ansible_shell_type' from source: unknown 10202 1727204055.18728: variable 'ansible_shell_executable' from source: unknown 10202 1727204055.18730: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204055.18732: variable 'ansible_pipelining' from source: unknown 10202 1727204055.18737: variable 'ansible_timeout' from source: unknown 10202 1727204055.18746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204055.18874: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204055.18891: variable 'omit' from source: magic vars 10202 1727204055.18894: starting attempt loop 10202 1727204055.18896: running the handler 10202 1727204055.18980: variable 'ansible_facts' from source: unknown 10202 1727204055.19580: _low_level_execute_command(): starting 10202 1727204055.19585: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204055.20139: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 10202 1727204055.20145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204055.20148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204055.20205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204055.20213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204055.20216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204055.20293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204055.22121: stdout chunk (state=3): >>>/root <<< 10202 1727204055.22224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204055.22294: stderr chunk (state=3): >>><<< 10202 1727204055.22297: stdout chunk (state=3): >>><<< 10202 1727204055.22376: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204055.22385: _low_level_execute_command(): starting 10202 1727204055.22389: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675 `" && echo ansible-tmp-1727204055.2231343-11343-2708658724675="` echo /root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675 `" ) && sleep 0' 10202 1727204055.22842: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204055.22846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204055.22849: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204055.22851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204055.22899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204055.22903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204055.22909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204055.22983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204055.25148: stdout chunk (state=3): >>>ansible-tmp-1727204055.2231343-11343-2708658724675=/root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675 <<< 10202 1727204055.25270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204055.25341: stderr chunk (state=3): >>><<< 10202 1727204055.25344: stdout chunk (state=3): >>><<< 10202 1727204055.25355: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204055.2231343-11343-2708658724675=/root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204055.25395: variable 'ansible_module_compression' from source: unknown 10202 1727204055.25443: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 10202 1727204055.25447: ANSIBALLZ: Acquiring lock 10202 1727204055.25450: ANSIBALLZ: Lock acquired: 140045305564624 10202 1727204055.25453: ANSIBALLZ: Creating module 10202 1727204055.51077: ANSIBALLZ: Writing module into payload 10202 1727204055.51179: ANSIBALLZ: Writing module 10202 1727204055.51218: ANSIBALLZ: Renaming module 10202 1727204055.51224: ANSIBALLZ: Done creating module 10202 1727204055.51300: variable 'ansible_facts' from source: unknown 10202 1727204055.51606: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/AnsiballZ_systemd.py 10202 1727204055.51743: Sending initial data 10202 1727204055.51747: Sent initial data (154 bytes) 10202 1727204055.52249: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204055.52253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10202 1727204055.52256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204055.52259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204055.52316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204055.52319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204055.52324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204055.52401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204055.54269: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204055.54331: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 10202 1727204055.54408: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp49te6k6b /root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/AnsiballZ_systemd.py <<< 10202 1727204055.54412: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/AnsiballZ_systemd.py" <<< 10202 1727204055.54485: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp49te6k6b" to remote "/root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/AnsiballZ_systemd.py" <<< 10202 1727204055.56337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204055.56377: stderr chunk (state=3): >>><<< 10202 1727204055.56391: stdout chunk (state=3): >>><<< 10202 1727204055.56422: done transferring module to remote 10202 1727204055.56447: _low_level_execute_command(): starting 10202 1727204055.56471: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/ /root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/AnsiballZ_systemd.py && sleep 0' 10202 1727204055.57234: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204055.57315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204055.57320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204055.57352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204055.57467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204055.59596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204055.59617: stderr chunk (state=3): >>><<< 10202 1727204055.59740: stdout chunk (state=3): >>><<< 10202 1727204055.59745: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204055.59747: _low_level_execute_command(): starting 10202 1727204055.59750: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/AnsiballZ_systemd.py && sleep 0' 10202 1727204055.60390: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204055.60414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204055.60433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204055.60520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204055.60571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204055.60601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204055.60711: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204055.99210: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11616256", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3509161984", "CPUUsageNSec": "801220000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", 
"UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCOR<<< 10202 1727204055.99247: stdout chunk (state=3): >>>E": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", 
"ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", 
"Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.service network.target multi-user.target cloud-init.service", "After": "system.slice cloud-init-local.service basic.target dbus-broker.service network-pre.target systemd-journald.socket dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:23 EDT", "StateChangeTimestampMonotonic": "340960243", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", 
"SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10202 1727204056.01633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204056.01637: stderr chunk (state=3): >>><<< 10202 1727204056.01639: stdout chunk (state=3): >>><<< 10202 1727204056.01668: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11616256", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3509161984", "CPUUsageNSec": "801220000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": 
"infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": 
"-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.service network.target multi-user.target cloud-init.service", "After": "system.slice cloud-init-local.service basic.target dbus-broker.service network-pre.target systemd-journald.socket dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:23 EDT", "StateChangeTimestampMonotonic": "340960243", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204056.01907: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204056.01953: _low_level_execute_command(): starting 10202 1727204056.01956: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204055.2231343-11343-2708658724675/ > /dev/null 2>&1 && sleep 0' 10202 1727204056.02687: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204056.02719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204056.02723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204056.02734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204056.02826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204056.05099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204056.05103: stdout chunk (state=3): >>><<< 10202 1727204056.05106: stderr chunk (state=3): >>><<< 10202 1727204056.05174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204056.05177: handler run complete 10202 1727204056.05226: attempt loop complete, returning result 10202 1727204056.05384: _execute() done 10202 1727204056.05391: dumping result to json 10202 1727204056.05407: done dumping result, returning 10202 1727204056.05419: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-0b04-2570-000000000032] 10202 1727204056.05425: sending task result for task 127b8e07-fff9-0b04-2570-000000000032 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204056.06075: no more pending results, returning what we have 10202 1727204056.06078: results queue empty 10202 1727204056.06079: checking for any_errors_fatal 10202 1727204056.06084: done checking for any_errors_fatal 10202 1727204056.06084: checking for max_fail_percentage 10202 1727204056.06086: done checking for max_fail_percentage 10202 1727204056.06087: checking to see if all hosts have failed and the running result is not ok 10202 1727204056.06088: done checking to see if all hosts have failed 10202 1727204056.06089: getting the remaining hosts for this loop 10202 1727204056.06090: done getting the remaining hosts for this loop 10202 1727204056.06093: getting the next task for host managed-node3 10202 1727204056.06099: done getting next task for host managed-node3 10202 1727204056.06102: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10202 1727204056.06105: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204056.06114: getting variables 10202 1727204056.06115: in VariableManager get_vars() 10202 1727204056.06154: Calling all_inventory to load vars for managed-node3 10202 1727204056.06157: Calling groups_inventory to load vars for managed-node3 10202 1727204056.06159: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204056.06171: Calling all_plugins_play to load vars for managed-node3 10202 1727204056.06175: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204056.06178: Calling groups_plugins_play to load vars for managed-node3 10202 1727204056.06795: done sending task result for task 127b8e07-fff9-0b04-2570-000000000032 10202 1727204056.06799: WORKER PROCESS EXITING 10202 1727204056.09087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204056.11416: done with get_vars() 10202 1727204056.11454: done getting variables 10202 1727204056.11523: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:16 -0400 (0:00:01.078) 0:00:17.790 ***** 10202 1727204056.11562: entering _queue_task() for 
managed-node3/service 10202 1727204056.11963: worker is 1 (out of 1 available) 10202 1727204056.11978: exiting _queue_task() for managed-node3/service 10202 1727204056.11991: done queuing things up, now waiting for results queue to drain 10202 1727204056.11993: waiting for pending results... 10202 1727204056.12254: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10202 1727204056.12401: in run() - task 127b8e07-fff9-0b04-2570-000000000033 10202 1727204056.12424: variable 'ansible_search_path' from source: unknown 10202 1727204056.12435: variable 'ansible_search_path' from source: unknown 10202 1727204056.12485: calling self._execute() 10202 1727204056.12595: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204056.12615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204056.12632: variable 'omit' from source: magic vars 10202 1727204056.13161: variable 'ansible_distribution_major_version' from source: facts 10202 1727204056.13188: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204056.13349: variable 'network_provider' from source: set_fact 10202 1727204056.13361: Evaluated conditional (network_provider == "nm"): True 10202 1727204056.13473: variable '__network_wpa_supplicant_required' from source: role '' defaults 10202 1727204056.13582: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10202 1727204056.13856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204056.16390: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204056.16479: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204056.16526: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204056.16572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204056.16604: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204056.16714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204056.16758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204056.16793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204056.16845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204056.16870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204056.16962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204056.16965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204056.16991: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204056.17040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204056.17059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204056.17114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204056.17145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204056.17182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204056.17470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204056.17474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204056.17476: variable 'network_connections' from source: task vars 10202 1727204056.17479: variable 'controller_profile' from source: play vars 10202 1727204056.17510: variable 'controller_profile' from source: play vars 10202 
1727204056.17525: variable 'controller_device' from source: play vars 10202 1727204056.17603: variable 'controller_device' from source: play vars 10202 1727204056.17619: variable 'port1_profile' from source: play vars 10202 1727204056.17689: variable 'port1_profile' from source: play vars 10202 1727204056.17706: variable 'dhcp_interface1' from source: play vars 10202 1727204056.17773: variable 'dhcp_interface1' from source: play vars 10202 1727204056.17786: variable 'controller_profile' from source: play vars 10202 1727204056.17856: variable 'controller_profile' from source: play vars 10202 1727204056.17870: variable 'port2_profile' from source: play vars 10202 1727204056.17940: variable 'port2_profile' from source: play vars 10202 1727204056.17953: variable 'dhcp_interface2' from source: play vars 10202 1727204056.18019: variable 'dhcp_interface2' from source: play vars 10202 1727204056.18039: variable 'controller_profile' from source: play vars 10202 1727204056.18105: variable 'controller_profile' from source: play vars 10202 1727204056.18197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204056.18401: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204056.18451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204056.18495: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204056.18532: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204056.18592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204056.18621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204056.18655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204056.18703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204056.18767: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204056.19047: variable 'network_connections' from source: task vars 10202 1727204056.19057: variable 'controller_profile' from source: play vars 10202 1727204056.19226: variable 'controller_profile' from source: play vars 10202 1727204056.19233: variable 'controller_device' from source: play vars 10202 1727204056.19235: variable 'controller_device' from source: play vars 10202 1727204056.19237: variable 'port1_profile' from source: play vars 10202 1727204056.19293: variable 'port1_profile' from source: play vars 10202 1727204056.19307: variable 'dhcp_interface1' from source: play vars 10202 1727204056.19381: variable 'dhcp_interface1' from source: play vars 10202 1727204056.19394: variable 'controller_profile' from source: play vars 10202 1727204056.19471: variable 'controller_profile' from source: play vars 10202 1727204056.19485: variable 'port2_profile' from source: play vars 10202 1727204056.19664: variable 'port2_profile' from source: play vars 10202 1727204056.19668: variable 'dhcp_interface2' from source: play vars 10202 1727204056.19672: variable 'dhcp_interface2' from source: play vars 10202 1727204056.19674: variable 'controller_profile' from source: play vars 10202 1727204056.19762: variable 'controller_profile' from source: play vars 10202 1727204056.19823: Evaluated conditional 
(__network_wpa_supplicant_required): False 10202 1727204056.19843: when evaluation is False, skipping this task 10202 1727204056.19851: _execute() done 10202 1727204056.19860: dumping result to json 10202 1727204056.19895: done dumping result, returning 10202 1727204056.19909: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-0b04-2570-000000000033] 10202 1727204056.19920: sending task result for task 127b8e07-fff9-0b04-2570-000000000033 10202 1727204056.20272: done sending task result for task 127b8e07-fff9-0b04-2570-000000000033 10202 1727204056.20276: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10202 1727204056.20323: no more pending results, returning what we have 10202 1727204056.20330: results queue empty 10202 1727204056.20331: checking for any_errors_fatal 10202 1727204056.20351: done checking for any_errors_fatal 10202 1727204056.20352: checking for max_fail_percentage 10202 1727204056.20354: done checking for max_fail_percentage 10202 1727204056.20355: checking to see if all hosts have failed and the running result is not ok 10202 1727204056.20356: done checking to see if all hosts have failed 10202 1727204056.20357: getting the remaining hosts for this loop 10202 1727204056.20359: done getting the remaining hosts for this loop 10202 1727204056.20365: getting the next task for host managed-node3 10202 1727204056.20373: done getting next task for host managed-node3 10202 1727204056.20378: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10202 1727204056.20381: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204056.20397: getting variables 10202 1727204056.20399: in VariableManager get_vars() 10202 1727204056.20447: Calling all_inventory to load vars for managed-node3 10202 1727204056.20451: Calling groups_inventory to load vars for managed-node3 10202 1727204056.20454: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204056.20509: Calling all_plugins_play to load vars for managed-node3 10202 1727204056.20519: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204056.20524: Calling groups_plugins_play to load vars for managed-node3 10202 1727204056.22418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204056.24748: done with get_vars() 10202 1727204056.24782: done getting variables 10202 1727204056.24853: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.133) 0:00:17.924 ***** 10202 1727204056.24896: entering _queue_task() for managed-node3/service 10202 1727204056.25371: worker is 1 (out of 1 available) 10202 1727204056.25385: exiting _queue_task() for managed-node3/service 
10202 1727204056.25397: done queuing things up, now waiting for results queue to drain 10202 1727204056.25398: waiting for pending results... 10202 1727204056.25612: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 10202 1727204056.25758: in run() - task 127b8e07-fff9-0b04-2570-000000000034 10202 1727204056.25782: variable 'ansible_search_path' from source: unknown 10202 1727204056.25789: variable 'ansible_search_path' from source: unknown 10202 1727204056.25830: calling self._execute() 10202 1727204056.25956: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204056.25975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204056.25990: variable 'omit' from source: magic vars 10202 1727204056.26447: variable 'ansible_distribution_major_version' from source: facts 10202 1727204056.26469: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204056.26672: variable 'network_provider' from source: set_fact 10202 1727204056.26676: Evaluated conditional (network_provider == "initscripts"): False 10202 1727204056.26679: when evaluation is False, skipping this task 10202 1727204056.26681: _execute() done 10202 1727204056.26683: dumping result to json 10202 1727204056.26686: done dumping result, returning 10202 1727204056.26689: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-0b04-2570-000000000034] 10202 1727204056.26692: sending task result for task 127b8e07-fff9-0b04-2570-000000000034 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204056.26883: no more pending results, returning what we have 10202 1727204056.26887: results queue empty 10202 1727204056.26888: checking for any_errors_fatal 10202 1727204056.26899: done checking for 
any_errors_fatal 10202 1727204056.26900: checking for max_fail_percentage 10202 1727204056.26902: done checking for max_fail_percentage 10202 1727204056.26903: checking to see if all hosts have failed and the running result is not ok 10202 1727204056.26904: done checking to see if all hosts have failed 10202 1727204056.26905: getting the remaining hosts for this loop 10202 1727204056.26907: done getting the remaining hosts for this loop 10202 1727204056.26912: getting the next task for host managed-node3 10202 1727204056.26920: done getting next task for host managed-node3 10202 1727204056.26924: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10202 1727204056.26928: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204056.26948: getting variables 10202 1727204056.26950: in VariableManager get_vars() 10202 1727204056.27001: Calling all_inventory to load vars for managed-node3 10202 1727204056.27005: Calling groups_inventory to load vars for managed-node3 10202 1727204056.27008: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204056.27024: Calling all_plugins_play to load vars for managed-node3 10202 1727204056.27027: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204056.27030: Calling groups_plugins_play to load vars for managed-node3 10202 1727204056.27784: done sending task result for task 127b8e07-fff9-0b04-2570-000000000034 10202 1727204056.27789: WORKER PROCESS EXITING 10202 1727204056.29035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204056.31242: done with get_vars() 10202 1727204056.31286: done getting variables 10202 1727204056.31353: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.064) 0:00:17.989 ***** 10202 1727204056.31397: entering _queue_task() for managed-node3/copy 10202 1727204056.31813: worker is 1 (out of 1 available) 10202 1727204056.31941: exiting _queue_task() for managed-node3/copy 10202 1727204056.31953: done queuing things up, now waiting for results queue to drain 10202 1727204056.31955: waiting for pending results... 
10202 1727204056.32115: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10202 1727204056.32289: in run() - task 127b8e07-fff9-0b04-2570-000000000035 10202 1727204056.32314: variable 'ansible_search_path' from source: unknown 10202 1727204056.32320: variable 'ansible_search_path' from source: unknown 10202 1727204056.32361: calling self._execute() 10202 1727204056.32463: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204056.32479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204056.32495: variable 'omit' from source: magic vars 10202 1727204056.32978: variable 'ansible_distribution_major_version' from source: facts 10202 1727204056.33000: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204056.33154: variable 'network_provider' from source: set_fact 10202 1727204056.33174: Evaluated conditional (network_provider == "initscripts"): False 10202 1727204056.33183: when evaluation is False, skipping this task 10202 1727204056.33190: _execute() done 10202 1727204056.33198: dumping result to json 10202 1727204056.33206: done dumping result, returning 10202 1727204056.33223: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-0b04-2570-000000000035] 10202 1727204056.33235: sending task result for task 127b8e07-fff9-0b04-2570-000000000035 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10202 1727204056.33524: no more pending results, returning what we have 10202 1727204056.33528: results queue empty 10202 1727204056.33530: checking for any_errors_fatal 10202 1727204056.33537: done checking for any_errors_fatal 10202 1727204056.33538: checking for max_fail_percentage 10202 
1727204056.33540: done checking for max_fail_percentage 10202 1727204056.33541: checking to see if all hosts have failed and the running result is not ok 10202 1727204056.33543: done checking to see if all hosts have failed 10202 1727204056.33543: getting the remaining hosts for this loop 10202 1727204056.33545: done getting the remaining hosts for this loop 10202 1727204056.33551: getting the next task for host managed-node3 10202 1727204056.33559: done getting next task for host managed-node3 10202 1727204056.33563: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10202 1727204056.33569: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204056.33589: getting variables 10202 1727204056.33591: in VariableManager get_vars() 10202 1727204056.33641: Calling all_inventory to load vars for managed-node3 10202 1727204056.33645: Calling groups_inventory to load vars for managed-node3 10202 1727204056.33648: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204056.33664: Calling all_plugins_play to load vars for managed-node3 10202 1727204056.33786: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204056.33792: done sending task result for task 127b8e07-fff9-0b04-2570-000000000035 10202 1727204056.33795: WORKER PROCESS EXITING 10202 1727204056.33800: Calling groups_plugins_play to load vars for managed-node3 10202 1727204056.35763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204056.40177: done with get_vars() 10202 1727204056.40221: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.090) 0:00:18.079 ***** 10202 1727204056.40442: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 10202 1727204056.40444: Creating lock for fedora.linux_system_roles.network_connections 10202 1727204056.41246: worker is 1 (out of 1 available) 10202 1727204056.41264: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 10202 1727204056.41373: done queuing things up, now waiting for results queue to drain 10202 1727204056.41375: waiting for pending results... 
10202 1727204056.41813: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10202 1727204056.42324: in run() - task 127b8e07-fff9-0b04-2570-000000000036 10202 1727204056.42378: variable 'ansible_search_path' from source: unknown 10202 1727204056.42451: variable 'ansible_search_path' from source: unknown 10202 1727204056.42585: calling self._execute() 10202 1727204056.43036: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204056.43043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204056.43046: variable 'omit' from source: magic vars 10202 1727204056.43429: variable 'ansible_distribution_major_version' from source: facts 10202 1727204056.43451: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204056.43475: variable 'omit' from source: magic vars 10202 1727204056.43551: variable 'omit' from source: magic vars 10202 1727204056.43760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204056.46356: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204056.46445: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204056.46494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204056.46545: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204056.46580: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204056.46683: variable 'network_provider' from source: set_fact 10202 1727204056.46855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204056.46902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204056.46964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204056.46991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204056.47010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204056.47104: variable 'omit' from source: magic vars 10202 1727204056.47241: variable 'omit' from source: magic vars 10202 1727204056.47399: variable 'network_connections' from source: task vars 10202 1727204056.47403: variable 'controller_profile' from source: play vars 10202 1727204056.47454: variable 'controller_profile' from source: play vars 10202 1727204056.47508: variable 'controller_device' from source: play vars 10202 1727204056.47543: variable 'controller_device' from source: play vars 10202 1727204056.47559: variable 'port1_profile' from source: play vars 10202 1727204056.47634: variable 'port1_profile' from source: play vars 10202 1727204056.47647: variable 'dhcp_interface1' from source: play vars 10202 1727204056.47713: variable 'dhcp_interface1' from source: play vars 10202 1727204056.47729: variable 'controller_profile' from source: play vars 10202 1727204056.47833: variable 'controller_profile' from source: play vars 10202 1727204056.47836: 
variable 'port2_profile' from source: play vars 10202 1727204056.47881: variable 'port2_profile' from source: play vars 10202 1727204056.47893: variable 'dhcp_interface2' from source: play vars 10202 1727204056.47963: variable 'dhcp_interface2' from source: play vars 10202 1727204056.47977: variable 'controller_profile' from source: play vars 10202 1727204056.48042: variable 'controller_profile' from source: play vars 10202 1727204056.48250: variable 'omit' from source: magic vars 10202 1727204056.48264: variable '__lsr_ansible_managed' from source: task vars 10202 1727204056.48340: variable '__lsr_ansible_managed' from source: task vars 10202 1727204056.48560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10202 1727204056.48799: Loaded config def from plugin (lookup/template) 10202 1727204056.48808: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10202 1727204056.48848: File lookup term: get_ansible_managed.j2 10202 1727204056.48855: variable 'ansible_search_path' from source: unknown 10202 1727204056.48867: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10202 1727204056.48887: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10202 1727204056.48935: variable 'ansible_search_path' from source: unknown 10202 1727204056.56716: variable 'ansible_managed' from source: unknown 10202 1727204056.56959: variable 'omit' from source: magic vars 10202 1727204056.57062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204056.57068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204056.57071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204056.57073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204056.57085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204056.57122: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204056.57132: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204056.57140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204056.57260: Set connection var ansible_shell_type to sh 10202 1727204056.57277: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204056.57288: Set connection var ansible_connection to ssh 10202 1727204056.57373: Set connection var ansible_shell_executable to /bin/sh 10202 1727204056.57386: Set connection var ansible_pipelining to False 10202 1727204056.57389: Set connection var ansible_timeout to 10 10202 1727204056.57391: 
variable 'ansible_shell_executable' from source: unknown 10202 1727204056.57393: variable 'ansible_connection' from source: unknown 10202 1727204056.57395: variable 'ansible_module_compression' from source: unknown 10202 1727204056.57397: variable 'ansible_shell_type' from source: unknown 10202 1727204056.57405: variable 'ansible_shell_executable' from source: unknown 10202 1727204056.57408: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204056.57410: variable 'ansible_pipelining' from source: unknown 10202 1727204056.57412: variable 'ansible_timeout' from source: unknown 10202 1727204056.57414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204056.57626: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204056.57629: variable 'omit' from source: magic vars 10202 1727204056.57634: starting attempt loop 10202 1727204056.57636: running the handler 10202 1727204056.57639: _low_level_execute_command(): starting 10202 1727204056.57732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204056.58490: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204056.58578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204056.58605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204056.58635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204056.58769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204056.60604: stdout chunk (state=3): >>>/root <<< 10202 1727204056.60814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204056.60818: stdout chunk (state=3): >>><<< 10202 1727204056.60821: stderr chunk (state=3): >>><<< 10202 1727204056.60842: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204056.60859: _low_level_execute_command(): starting 10202 1727204056.60954: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159 `" && echo ansible-tmp-1727204056.6084826-11397-265340748842159="` echo /root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159 `" ) && sleep 0' 10202 1727204056.61589: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204056.61616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204056.61680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204056.61684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204056.61751: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204056.63989: stdout chunk (state=3): >>>ansible-tmp-1727204056.6084826-11397-265340748842159=/root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159 <<< 10202 1727204056.64210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204056.64214: stdout chunk (state=3): >>><<< 10202 1727204056.64217: stderr chunk (state=3): >>><<< 10202 1727204056.64243: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204056.6084826-11397-265340748842159=/root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204056.64370: variable 'ansible_module_compression' from source: unknown 10202 1727204056.64380: ANSIBALLZ: Using lock for 
fedora.linux_system_roles.network_connections 10202 1727204056.64390: ANSIBALLZ: Acquiring lock 10202 1727204056.64398: ANSIBALLZ: Lock acquired: 140045301771984 10202 1727204056.64405: ANSIBALLZ: Creating module 10202 1727204056.83735: ANSIBALLZ: Writing module into payload 10202 1727204056.83972: ANSIBALLZ: Writing module 10202 1727204056.83996: ANSIBALLZ: Renaming module 10202 1727204056.84002: ANSIBALLZ: Done creating module 10202 1727204056.84025: variable 'ansible_facts' from source: unknown 10202 1727204056.84105: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/AnsiballZ_network_connections.py 10202 1727204056.84222: Sending initial data 10202 1727204056.84226: Sent initial data (168 bytes) 10202 1727204056.84739: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204056.84744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204056.84757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204056.84827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204056.84830: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204056.84833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204056.84925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204056.86748: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204056.86811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204056.86872: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpeto83_g8 /root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/AnsiballZ_network_connections.py <<< 10202 1727204056.86882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/AnsiballZ_network_connections.py" <<< 10202 1727204056.86934: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpeto83_g8" to remote "/root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/AnsiballZ_network_connections.py" <<< 10202 1727204056.86942: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/AnsiballZ_network_connections.py" <<< 10202 1727204056.87809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204056.87884: stderr chunk (state=3): >>><<< 10202 1727204056.87888: stdout chunk (state=3): >>><<< 10202 1727204056.87909: done transferring module to remote 10202 1727204056.87920: _low_level_execute_command(): starting 10202 1727204056.87925: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/ /root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/AnsiballZ_network_connections.py && sleep 0' 10202 1727204056.88431: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204056.88435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204056.88438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204056.88440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204056.88490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204056.88494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204056.88496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204056.88570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204056.90610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204056.90671: stderr chunk (state=3): >>><<< 10202 1727204056.90675: stdout chunk (state=3): >>><<< 10202 1727204056.90690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204056.90693: _low_level_execute_command(): starting 10202 1727204056.90699: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/AnsiballZ_network_connections.py && sleep 0' 10202 1727204056.91203: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204056.91207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204056.91211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204056.91213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204056.91215: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204056.91275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204056.91282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204056.91284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204056.91358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204057.52183: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": 
"up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 10202 1727204057.52210: stdout chunk (state=3): >>> <<< 10202 1727204057.54798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204057.54803: stdout chunk (state=3): >>><<< 10202 1727204057.54805: stderr chunk (state=3): >>><<< 10202 1727204057.54974: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": 
"bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.45.169 closed. 10202 1727204057.54979: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204057.54987: _low_level_execute_command(): starting 10202 1727204057.54990: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204056.6084826-11397-265340748842159/ > /dev/null 2>&1 && sleep 0' 10202 1727204057.55706: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204057.55744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204057.55748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 <<< 10202 1727204057.55787: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204057.55791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204057.55851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204057.55864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204057.55940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204057.58042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204057.58106: stderr chunk (state=3): >>><<< 10202 1727204057.58110: stdout chunk (state=3): >>><<< 10202 1727204057.58125: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204057.58134: handler run complete 10202 1727204057.58168: attempt loop complete, returning result 10202 1727204057.58172: _execute() done 10202 1727204057.58174: dumping result to json 10202 1727204057.58181: done dumping result, returning 10202 1727204057.58190: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-0b04-2570-000000000036] 10202 1727204057.58195: sending task result for task 127b8e07-fff9-0b04-2570-000000000036 10202 1727204057.58321: done sending task result for task 127b8e07-fff9-0b04-2570-000000000036 10202 1727204057.58324: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add 
connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01 (not-active) 10202 1727204057.58477: no more pending results, returning what we have 10202 1727204057.58481: results queue empty 10202 1727204057.58482: checking for any_errors_fatal 10202 1727204057.58490: done checking for any_errors_fatal 10202 1727204057.58491: checking for max_fail_percentage 10202 1727204057.58492: done checking for max_fail_percentage 10202 1727204057.58493: checking to see if all hosts have failed and the running result is not ok 10202 1727204057.58494: done checking to see if all hosts have failed 10202 1727204057.58494: getting the remaining hosts for this loop 10202 1727204057.58496: done getting the remaining hosts for this loop 10202 1727204057.58500: getting the next task for host managed-node3 10202 1727204057.58506: done getting next task for host managed-node3 10202 1727204057.58510: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10202 1727204057.58512: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204057.58523: getting variables 10202 1727204057.58524: in VariableManager get_vars() 10202 1727204057.58573: Calling all_inventory to load vars for managed-node3 10202 1727204057.58576: Calling groups_inventory to load vars for managed-node3 10202 1727204057.58578: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204057.58588: Calling all_plugins_play to load vars for managed-node3 10202 1727204057.58591: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204057.58593: Calling groups_plugins_play to load vars for managed-node3 10202 1727204057.60202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204057.61568: done with get_vars() 10202 1727204057.61592: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:17 -0400 (0:00:01.212) 0:00:19.292 ***** 10202 1727204057.61670: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 10202 1727204057.61672: Creating lock for fedora.linux_system_roles.network_state 10202 1727204057.61959: worker is 1 (out of 1 available) 10202 1727204057.61976: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 10202 1727204057.61990: done queuing things up, now waiting for results queue to drain 10202 1727204057.61992: waiting for pending results... 
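The `module_args` logged above for the "Configure networking connection profiles" task can be read back into role input. The following is a reconstruction, not the actual playbook from this run: variable names follow the `fedora.linux_system_roles.network` role convention, and the values are taken verbatim from the logged invocation (bond `bond0` on `nm-bond` with two ethernet ports `test1`/`test2`).

```yaml
# Reconstructed sketch of the role input that would produce the
# module_args logged above. The real playbook for this run is not
# shown in the log; hosts/structure here are assumptions, while the
# connection values are copied from the logged invocation.
- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0
            type: bond
            interface_name: nm-bond
            state: up
            bond:
              mode: active-backup
              miimon: 110
            ip:
              route_metric4: 65535
          - name: bond0.0
            type: ethernet
            interface_name: test1
            controller: bond0
            state: up
          - name: bond0.1
            type: ethernet
            interface_name: test2
            controller: bond0
            state: up
```

The stderr lines `[007]`–`[012]` in the result correspond one-to-one to these three connections: first each profile is added (`add connection`), then each is activated (`up connection`), with `bond0` flagged `is-modified` and the two ports `not-active` prior to activation.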
10202 1727204057.62184: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 10202 1727204057.62277: in run() - task 127b8e07-fff9-0b04-2570-000000000037 10202 1727204057.62289: variable 'ansible_search_path' from source: unknown 10202 1727204057.62293: variable 'ansible_search_path' from source: unknown 10202 1727204057.62335: calling self._execute() 10202 1727204057.62672: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.62676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.62679: variable 'omit' from source: magic vars 10202 1727204057.62857: variable 'ansible_distribution_major_version' from source: facts 10202 1727204057.62879: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204057.63012: variable 'network_state' from source: role '' defaults 10202 1727204057.63027: Evaluated conditional (network_state != {}): False 10202 1727204057.63034: when evaluation is False, skipping this task 10202 1727204057.63041: _execute() done 10202 1727204057.63046: dumping result to json 10202 1727204057.63052: done dumping result, returning 10202 1727204057.63062: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-0b04-2570-000000000037] 10202 1727204057.63075: sending task result for task 127b8e07-fff9-0b04-2570-000000000037 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10202 1727204057.63246: no more pending results, returning what we have 10202 1727204057.63250: results queue empty 10202 1727204057.63251: checking for any_errors_fatal 10202 1727204057.63268: done checking for any_errors_fatal 10202 1727204057.63269: checking for max_fail_percentage 10202 1727204057.63271: done checking for max_fail_percentage 10202 1727204057.63272: 
checking to see if all hosts have failed and the running result is not ok 10202 1727204057.63273: done checking to see if all hosts have failed 10202 1727204057.63274: getting the remaining hosts for this loop 10202 1727204057.63275: done getting the remaining hosts for this loop 10202 1727204057.63280: getting the next task for host managed-node3 10202 1727204057.63287: done getting next task for host managed-node3 10202 1727204057.63291: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10202 1727204057.63294: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204057.63316: getting variables 10202 1727204057.63318: in VariableManager get_vars() 10202 1727204057.63364: Calling all_inventory to load vars for managed-node3 10202 1727204057.63474: Calling groups_inventory to load vars for managed-node3 10202 1727204057.63477: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204057.63488: Calling all_plugins_play to load vars for managed-node3 10202 1727204057.63490: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204057.63494: Calling groups_plugins_play to load vars for managed-node3 10202 1727204057.64016: done sending task result for task 127b8e07-fff9-0b04-2570-000000000037 10202 1727204057.64020: WORKER PROCESS EXITING 10202 1727204057.65285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204057.67367: done with get_vars() 10202 1727204057.67408: done getting variables 10202 1727204057.67480: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.058) 0:00:19.350 ***** 10202 1727204057.67518: entering _queue_task() for managed-node3/debug 10202 1727204057.67891: worker is 1 (out of 1 available) 10202 1727204057.67906: exiting _queue_task() for managed-node3/debug 10202 1727204057.67921: done queuing things up, now waiting for results queue to drain 10202 1727204057.67923: waiting for pending results... 
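The "Configure networking state" task above was skipped because `network_state` was left at its role default of `{}`, so the task's `when: network_state != {}` condition evaluated False. For illustration only (no `network_state` value appears anywhere in this run), a minimal nmstate-format dictionary such as the following would flip that conditional to True and route configuration through the state provider instead:

```yaml
# Illustrative assumption: this run used network_connections, not
# network_state. A non-empty nmstate-style value like this would
# make the skipped conditional (network_state != {}) evaluate True.
network_state:
  interfaces:
    - name: eth0        # hypothetical interface, not from this run
      type: ethernet
      state: up
```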
10202 1727204057.68238: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10202 1727204057.68386: in run() - task 127b8e07-fff9-0b04-2570-000000000038 10202 1727204057.68409: variable 'ansible_search_path' from source: unknown 10202 1727204057.68417: variable 'ansible_search_path' from source: unknown 10202 1727204057.68458: calling self._execute() 10202 1727204057.68580: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.68595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.68615: variable 'omit' from source: magic vars 10202 1727204057.69057: variable 'ansible_distribution_major_version' from source: facts 10202 1727204057.69154: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204057.69158: variable 'omit' from source: magic vars 10202 1727204057.69161: variable 'omit' from source: magic vars 10202 1727204057.69207: variable 'omit' from source: magic vars 10202 1727204057.69261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204057.69312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204057.69341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204057.69370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204057.69394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204057.69434: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204057.69443: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.69492: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 10202 1727204057.69579: Set connection var ansible_shell_type to sh 10202 1727204057.69596: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204057.69612: Set connection var ansible_connection to ssh 10202 1727204057.69623: Set connection var ansible_shell_executable to /bin/sh 10202 1727204057.69635: Set connection var ansible_pipelining to False 10202 1727204057.69713: Set connection var ansible_timeout to 10 10202 1727204057.69717: variable 'ansible_shell_executable' from source: unknown 10202 1727204057.69719: variable 'ansible_connection' from source: unknown 10202 1727204057.69722: variable 'ansible_module_compression' from source: unknown 10202 1727204057.69724: variable 'ansible_shell_type' from source: unknown 10202 1727204057.69726: variable 'ansible_shell_executable' from source: unknown 10202 1727204057.69729: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.69731: variable 'ansible_pipelining' from source: unknown 10202 1727204057.69733: variable 'ansible_timeout' from source: unknown 10202 1727204057.69735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.69895: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204057.69914: variable 'omit' from source: magic vars 10202 1727204057.69927: starting attempt loop 10202 1727204057.69939: running the handler 10202 1727204057.70096: variable '__network_connections_result' from source: set_fact 10202 1727204057.70178: handler run complete 10202 1727204057.70205: attempt loop complete, returning result 10202 1727204057.70262: _execute() done 10202 1727204057.70267: dumping result to json 10202 1727204057.70271: 
done dumping result, returning 10202 1727204057.70274: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-0b04-2570-000000000038] 10202 1727204057.70276: sending task result for task 127b8e07-fff9-0b04-2570-000000000038 ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01 (not-active)" ] } 10202 1727204057.70440: no more pending results, returning what we have 10202 1727204057.70444: results queue empty 10202 1727204057.70445: checking for any_errors_fatal 10202 1727204057.70453: done checking for any_errors_fatal 10202 1727204057.70454: checking for max_fail_percentage 10202 1727204057.70455: done checking for max_fail_percentage 10202 1727204057.70457: checking to see if all hosts have failed and the running result is not ok 10202 1727204057.70458: done checking to see if all hosts have failed 10202 1727204057.70459: getting the remaining hosts for this loop 10202 1727204057.70460: done getting the remaining hosts for this loop 10202 1727204057.70468: getting the next task for host managed-node3 10202 1727204057.70475: done getting next task for host managed-node3 10202 1727204057.70480: ^ task is: TASK: 
fedora.linux_system_roles.network : Show debug messages for the network_connections 10202 1727204057.70484: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204057.70497: getting variables 10202 1727204057.70499: in VariableManager get_vars() 10202 1727204057.70545: Calling all_inventory to load vars for managed-node3 10202 1727204057.70549: Calling groups_inventory to load vars for managed-node3 10202 1727204057.70551: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204057.70774: Calling all_plugins_play to load vars for managed-node3 10202 1727204057.70779: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204057.70785: done sending task result for task 127b8e07-fff9-0b04-2570-000000000038 10202 1727204057.70788: WORKER PROCESS EXITING 10202 1727204057.70793: Calling groups_plugins_play to load vars for managed-node3 10202 1727204057.72529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204057.73724: done with get_vars() 10202 1727204057.73754: done getting variables 10202 1727204057.73807: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.063) 0:00:19.413 ***** 10202 1727204057.73839: entering _queue_task() for managed-node3/debug 10202 1727204057.74168: worker is 1 (out of 1 available) 10202 1727204057.74183: exiting _queue_task() for managed-node3/debug 10202 1727204057.74196: done queuing things up, now waiting for results queue to drain 10202 1727204057.74198: waiting for pending results... 10202 1727204057.74633: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10202 1727204057.74792: in run() - task 127b8e07-fff9-0b04-2570-000000000039 10202 1727204057.74817: variable 'ansible_search_path' from source: unknown 10202 1727204057.75071: variable 'ansible_search_path' from source: unknown 10202 1727204057.75076: calling self._execute() 10202 1727204057.75161: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.75177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.75202: variable 'omit' from source: magic vars 10202 1727204057.75893: variable 'ansible_distribution_major_version' from source: facts 10202 1727204057.75915: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204057.75932: variable 'omit' from source: magic vars 10202 1727204057.76006: variable 'omit' from source: magic vars 10202 1727204057.76057: variable 'omit' from source: magic vars 10202 1727204057.76097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204057.76132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204057.76148: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204057.76163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204057.76177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204057.76208: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204057.76212: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.76215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.76295: Set connection var ansible_shell_type to sh 10202 1727204057.76299: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204057.76306: Set connection var ansible_connection to ssh 10202 1727204057.76312: Set connection var ansible_shell_executable to /bin/sh 10202 1727204057.76317: Set connection var ansible_pipelining to False 10202 1727204057.76323: Set connection var ansible_timeout to 10 10202 1727204057.76350: variable 'ansible_shell_executable' from source: unknown 10202 1727204057.76354: variable 'ansible_connection' from source: unknown 10202 1727204057.76357: variable 'ansible_module_compression' from source: unknown 10202 1727204057.76359: variable 'ansible_shell_type' from source: unknown 10202 1727204057.76362: variable 'ansible_shell_executable' from source: unknown 10202 1727204057.76364: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.76368: variable 'ansible_pipelining' from source: unknown 10202 1727204057.76370: variable 'ansible_timeout' from source: unknown 10202 1727204057.76373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.76492: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204057.76503: variable 'omit' from source: magic vars 10202 1727204057.76508: starting attempt loop 10202 1727204057.76511: running the handler 10202 1727204057.76559: variable '__network_connections_result' from source: set_fact 10202 1727204057.76628: variable '__network_connections_result' from source: set_fact 10202 1727204057.76760: handler run complete 10202 1727204057.76786: attempt loop complete, returning result 10202 1727204057.76789: _execute() done 10202 1727204057.76792: dumping result to json 10202 1727204057.76797: done dumping result, returning 10202 1727204057.76805: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-0b04-2570-000000000039] 10202 1727204057.76811: sending task result for task 127b8e07-fff9-0b04-2570-000000000039 10202 1727204057.76918: done sending task result for task 127b8e07-fff9-0b04-2570-000000000039 10202 1727204057.76921: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
f32dee73-d17d-466b-80a4-4a2bab216d3b\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, f32dee73-d17d-466b-80a4-4a2bab216d3b (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e3635d21-1c21-4d1f-ab3f-feb9091d21d0 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c296b684-3111-41df-a255-0a2b3f77cf01 (not-active)" ] } } 10202 1727204057.77026: no more pending results, returning what we have 10202 1727204057.77029: results queue empty 10202 1727204057.77037: checking for any_errors_fatal 10202 1727204057.77042: done checking for any_errors_fatal 10202 1727204057.77043: checking for max_fail_percentage 10202 1727204057.77044: done checking for max_fail_percentage 10202 1727204057.77045: checking to see if all hosts have failed and the running result is not ok 10202 1727204057.77046: done checking to see if all hosts have failed 10202 1727204057.77047: getting the remaining 
hosts for this loop 10202 1727204057.77049: done getting the remaining hosts for this loop 10202 1727204057.77053: getting the next task for host managed-node3 10202 1727204057.77058: done getting next task for host managed-node3 10202 1727204057.77062: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10202 1727204057.77072: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204057.77084: getting variables 10202 1727204057.77085: in VariableManager get_vars() 10202 1727204057.77121: Calling all_inventory to load vars for managed-node3 10202 1727204057.77124: Calling groups_inventory to load vars for managed-node3 10202 1727204057.77126: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204057.77135: Calling all_plugins_play to load vars for managed-node3 10202 1727204057.77138: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204057.77140: Calling groups_plugins_play to load vars for managed-node3 10202 1727204057.78356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204057.81477: done with get_vars() 10202 1727204057.81512: done getting variables 10202 1727204057.81615: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.078) 0:00:19.491 ***** 10202 1727204057.81657: entering _queue_task() for managed-node3/debug 10202 1727204057.82069: worker is 1 (out of 1 available) 10202 1727204057.82086: exiting _queue_task() for managed-node3/debug 10202 1727204057.82102: done queuing things up, now waiting for results queue to drain 10202 1727204057.82103: waiting for pending results... 10202 1727204057.82559: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10202 1727204057.82593: in run() - task 127b8e07-fff9-0b04-2570-00000000003a 10202 1727204057.82654: variable 'ansible_search_path' from source: unknown 10202 1727204057.82658: variable 'ansible_search_path' from source: unknown 10202 1727204057.82675: calling self._execute() 10202 1727204057.82784: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.82798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.82813: variable 'omit' from source: magic vars 10202 1727204057.83369: variable 'ansible_distribution_major_version' from source: facts 10202 1727204057.83420: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204057.83550: variable 'network_state' from source: role '' defaults 10202 1727204057.83573: Evaluated conditional (network_state != {}): False 10202 1727204057.83581: when evaluation is False, skipping this task 10202 1727204057.83588: _execute() done 10202 1727204057.83637: dumping result to json 10202 1727204057.83640: done 
dumping result, returning 10202 1727204057.83643: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-0b04-2570-00000000003a] 10202 1727204057.83646: sending task result for task 127b8e07-fff9-0b04-2570-00000000003a 10202 1727204057.83863: done sending task result for task 127b8e07-fff9-0b04-2570-00000000003a 10202 1727204057.83868: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 10202 1727204057.83920: no more pending results, returning what we have 10202 1727204057.83925: results queue empty 10202 1727204057.83926: checking for any_errors_fatal 10202 1727204057.83935: done checking for any_errors_fatal 10202 1727204057.83936: checking for max_fail_percentage 10202 1727204057.83937: done checking for max_fail_percentage 10202 1727204057.83938: checking to see if all hosts have failed and the running result is not ok 10202 1727204057.83939: done checking to see if all hosts have failed 10202 1727204057.83941: getting the remaining hosts for this loop 10202 1727204057.83942: done getting the remaining hosts for this loop 10202 1727204057.83948: getting the next task for host managed-node3 10202 1727204057.83954: done getting next task for host managed-node3 10202 1727204057.83961: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10202 1727204057.83964: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 10202 1727204057.83984: getting variables 10202 1727204057.83986: in VariableManager get_vars() 10202 1727204057.84038: Calling all_inventory to load vars for managed-node3 10202 1727204057.84042: Calling groups_inventory to load vars for managed-node3 10202 1727204057.84044: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204057.84059: Calling all_plugins_play to load vars for managed-node3 10202 1727204057.84062: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204057.84071: Calling groups_plugins_play to load vars for managed-node3 10202 1727204057.86237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204057.88715: done with get_vars() 10202 1727204057.88760: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.073) 0:00:19.565 ***** 10202 1727204057.89038: entering _queue_task() for managed-node3/ping 10202 1727204057.89040: Creating lock for ping 10202 1727204057.89452: worker is 1 (out of 1 available) 10202 1727204057.89576: exiting _queue_task() for managed-node3/ping 10202 1727204057.89587: done queuing things up, now waiting for results queue to drain 10202 1727204057.89588: waiting for pending results... 
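The log above shows the role creating a lock for `ping` and queuing it for managed-node3 as the "Re-test connectivity" task (task path `.../roles/network/tasks/main.yml:192`). A minimal sketch of what such a task looks like; the actual task file may carry additional options not visible in this trace:

```yaml
# Sketch of the connectivity re-test as logged above; the real task
# at main.yml:192 in the role may differ in detail.
- name: Re-test connectivity
  ansible.builtin.ping:
```

The subsequent `_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'` lines are the normal first step of remote module execution: Ansible resolves the remote home directory over the multiplexed SSH connection before staging the module payload.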
10202 1727204057.89817: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 10202 1727204057.89973: in run() - task 127b8e07-fff9-0b04-2570-00000000003b 10202 1727204057.89997: variable 'ansible_search_path' from source: unknown 10202 1727204057.90092: variable 'ansible_search_path' from source: unknown 10202 1727204057.90126: calling self._execute() 10202 1727204057.90235: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.90308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.90311: variable 'omit' from source: magic vars 10202 1727204057.90792: variable 'ansible_distribution_major_version' from source: facts 10202 1727204057.90811: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204057.90823: variable 'omit' from source: magic vars 10202 1727204057.90900: variable 'omit' from source: magic vars 10202 1727204057.90943: variable 'omit' from source: magic vars 10202 1727204057.90999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204057.91072: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204057.91076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204057.91096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204057.91112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204057.91149: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204057.91157: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.91181: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 10202 1727204057.91291: Set connection var ansible_shell_type to sh 10202 1727204057.91307: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204057.91369: Set connection var ansible_connection to ssh 10202 1727204057.91373: Set connection var ansible_shell_executable to /bin/sh 10202 1727204057.91375: Set connection var ansible_pipelining to False 10202 1727204057.91378: Set connection var ansible_timeout to 10 10202 1727204057.91380: variable 'ansible_shell_executable' from source: unknown 10202 1727204057.91382: variable 'ansible_connection' from source: unknown 10202 1727204057.91384: variable 'ansible_module_compression' from source: unknown 10202 1727204057.91389: variable 'ansible_shell_type' from source: unknown 10202 1727204057.91400: variable 'ansible_shell_executable' from source: unknown 10202 1727204057.91410: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204057.91419: variable 'ansible_pipelining' from source: unknown 10202 1727204057.91425: variable 'ansible_timeout' from source: unknown 10202 1727204057.91433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204057.91672: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204057.91691: variable 'omit' from source: magic vars 10202 1727204057.91725: starting attempt loop 10202 1727204057.91728: running the handler 10202 1727204057.91734: _low_level_execute_command(): starting 10202 1727204057.91743: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204057.92592: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204057.92648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204057.92675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204057.92871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204057.94640: stdout chunk (state=3): >>>/root <<< 10202 1727204057.94814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204057.94818: stderr chunk (state=3): >>><<< 10202 1727204057.94822: stdout chunk (state=3): >>><<< 10202 1727204057.94876: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204057.94880: _low_level_execute_command(): starting 10202 1727204057.94883: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061 `" && echo ansible-tmp-1727204057.9484816-11454-90003471793061="` echo /root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061 `" ) && sleep 0' 10202 1727204057.95674: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204057.95678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204057.95680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204057.95683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204057.95685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204057.95687: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204057.95689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204057.95701: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204057.95705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204057.95721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204057.95750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204057.95857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204057.98067: stdout chunk (state=3): >>>ansible-tmp-1727204057.9484816-11454-90003471793061=/root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061 <<< 10202 1727204057.98196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204057.98297: stderr chunk (state=3): >>><<< 10202 1727204057.98308: stdout chunk (state=3): >>><<< 10202 1727204057.98337: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204057.9484816-11454-90003471793061=/root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204057.98406: variable 'ansible_module_compression' from source: unknown 10202 1727204057.98463: ANSIBALLZ: Using lock for ping 10202 1727204057.98671: ANSIBALLZ: Acquiring lock 10202 1727204057.98675: ANSIBALLZ: Lock acquired: 140045303860352 10202 1727204057.98677: ANSIBALLZ: Creating module 10202 1727204058.15445: ANSIBALLZ: Writing module into payload 10202 1727204058.15595: ANSIBALLZ: Writing module 10202 1727204058.15702: ANSIBALLZ: Renaming module 10202 1727204058.15715: ANSIBALLZ: Done creating module 10202 1727204058.15755: variable 'ansible_facts' from source: unknown 10202 1727204058.15912: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/AnsiballZ_ping.py 10202 1727204058.16213: Sending initial data 10202 1727204058.16216: Sent initial data (152 bytes) 10202 1727204058.18030: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204058.18051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.18165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204058.19987: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204058.20051: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204058.20185: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpdak4rap6 /root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/AnsiballZ_ping.py <<< 10202 1727204058.20189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/AnsiballZ_ping.py" <<< 10202 1727204058.20260: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpdak4rap6" to remote "/root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/AnsiballZ_ping.py" <<< 10202 1727204058.22495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204058.22499: stderr chunk (state=3): >>><<< 10202 1727204058.22501: stdout chunk (state=3): >>><<< 10202 1727204058.22515: done transferring module to remote 10202 1727204058.22528: _low_level_execute_command(): starting 10202 1727204058.22602: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/ /root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/AnsiballZ_ping.py && sleep 0' 10202 1727204058.23894: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.23951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204058.23984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204058.24000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.24269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204058.26638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204058.26642: stdout chunk (state=3): >>><<< 10202 1727204058.26645: stderr chunk (state=3): >>><<< 10202 1727204058.26647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204058.26650: _low_level_execute_command(): starting 10202 1727204058.26652: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/AnsiballZ_ping.py && sleep 0' 10202 1727204058.27858: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204058.27862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.27866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204058.27872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.28190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.28612: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 10202 1727204058.46000: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10202 1727204058.47535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204058.47604: stderr chunk (state=3): >>><<< 10202 1727204058.47608: stdout chunk (state=3): >>><<< 10202 1727204058.47626: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
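The JSON payload `{"ping": "pong", ...}` above is the entire result of the `ping` module: it takes a single `data` option (default `pong`) and echoes it back, so a successful round trip proves the control node can reach the host, create the temp dir (`umask 77 && mkdir -p ...`), push the `AnsiballZ_ping.py` payload over SFTP, and execute it with the remote Python. A stripped-down stand-in for the module's core logic (not Ansible's actual `AnsiballZ` wrapper; the bare function form is illustrative):

```python
def ping(data: str = "pong") -> dict:
    """Simplified stand-in for the core of Ansible's ping module:
    echo the 'data' option back so the caller can verify end-to-end
    module execution on the managed host."""
    if data == "crash":
        # The real module deliberately raises on data == "crash" so
        # failure handling can be exercised; we mirror that behaviour.
        raise Exception("boom")
    return {"ping": data, "changed": False}

print(ping())  # {'ping': 'pong', 'changed': False}
```

The `changed: false` in the task result above reflects the same contract: a connectivity check never modifies the host.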
10202 1727204058.47653: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204058.47659: _low_level_execute_command(): starting 10202 1727204058.47664: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204057.9484816-11454-90003471793061/ > /dev/null 2>&1 && sleep 0' 10202 1727204058.48160: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204058.48167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10202 1727204058.48187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204058.48190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.48249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204058.48253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.48335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204058.50527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204058.50532: stdout chunk (state=3): >>><<< 10202 1727204058.50535: stderr chunk (state=3): >>><<< 10202 1727204058.50629: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204058.50637: handler run 
complete 10202 1727204058.50640: attempt loop complete, returning result 10202 1727204058.50642: _execute() done 10202 1727204058.50644: dumping result to json 10202 1727204058.50647: done dumping result, returning 10202 1727204058.50649: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-0b04-2570-00000000003b] 10202 1727204058.50651: sending task result for task 127b8e07-fff9-0b04-2570-00000000003b 10202 1727204058.50726: done sending task result for task 127b8e07-fff9-0b04-2570-00000000003b 10202 1727204058.50730: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 10202 1727204058.50825: no more pending results, returning what we have 10202 1727204058.50829: results queue empty 10202 1727204058.50831: checking for any_errors_fatal 10202 1727204058.50836: done checking for any_errors_fatal 10202 1727204058.50836: checking for max_fail_percentage 10202 1727204058.50838: done checking for max_fail_percentage 10202 1727204058.50839: checking to see if all hosts have failed and the running result is not ok 10202 1727204058.50840: done checking to see if all hosts have failed 10202 1727204058.50841: getting the remaining hosts for this loop 10202 1727204058.50842: done getting the remaining hosts for this loop 10202 1727204058.50847: getting the next task for host managed-node3 10202 1727204058.50856: done getting next task for host managed-node3 10202 1727204058.50858: ^ task is: TASK: meta (role_complete) 10202 1727204058.50861: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204058.50874: getting variables 10202 1727204058.50875: in VariableManager get_vars() 10202 1727204058.50919: Calling all_inventory to load vars for managed-node3 10202 1727204058.50922: Calling groups_inventory to load vars for managed-node3 10202 1727204058.50924: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204058.50936: Calling all_plugins_play to load vars for managed-node3 10202 1727204058.50938: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204058.50941: Calling groups_plugins_play to load vars for managed-node3 10202 1727204058.53043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204058.54495: done with get_vars() 10202 1727204058.54524: done getting variables 10202 1727204058.54597: done queuing things up, now waiting for results queue to drain 10202 1727204058.54599: results queue empty 10202 1727204058.54600: checking for any_errors_fatal 10202 1727204058.54602: done checking for any_errors_fatal 10202 1727204058.54603: checking for max_fail_percentage 10202 1727204058.54603: done checking for max_fail_percentage 10202 1727204058.54604: checking to see if all hosts have failed and the running result is not ok 10202 1727204058.54604: done checking to see if all hosts have failed 10202 1727204058.54605: getting the remaining hosts for this loop 10202 1727204058.54606: done getting the remaining hosts for this loop 10202 1727204058.54608: getting the next task for host managed-node3 10202 1727204058.54612: done getting next task for host managed-node3 10202 1727204058.54613: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10202 1727204058.54615: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204058.54617: getting variables 10202 1727204058.54618: in VariableManager get_vars() 10202 1727204058.54630: Calling all_inventory to load vars for managed-node3 10202 1727204058.54631: Calling groups_inventory to load vars for managed-node3 10202 1727204058.54633: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204058.54637: Calling all_plugins_play to load vars for managed-node3 10202 1727204058.54638: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204058.54641: Calling groups_plugins_play to load vars for managed-node3 10202 1727204058.55478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204058.56704: done with get_vars() 10202 1727204058.56724: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.677) 0:00:20.243 ***** 10202 1727204058.56787: entering _queue_task() for managed-node3/include_tasks 10202 1727204058.57074: worker is 1 (out of 1 available) 10202 1727204058.57090: exiting _queue_task() for managed-node3/include_tasks 10202 1727204058.57105: done queuing things up, now waiting for results queue to drain 10202 1727204058.57106: waiting for pending results... 
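Every task in this run logs `Evaluated conditional (ansible_distribution_major_version != '6')` before executing: a `when:` guard tested against gathered facts. In Ansible this is a Jinja2 expression; the pure-Python approximation below is a simplification for illustration, not how templating actually works, and the facts value shown is assumed:

```python
# Approximate the conditional gate seen throughout this log:
#   when: ansible_distribution_major_version != '6'
# Facts arrive as strings, so this is a string comparison, which is
# why the log (and the playbook) quote '6'.
facts = {"ansible_distribution_major_version": "40"}  # illustrative value

def conditional_passes(facts: dict) -> bool:
    return facts.get("ansible_distribution_major_version") != "6"

print(conditional_passes(facts))  # True
```

When the guard evaluates False, the task is skipped before any connection work happens; here it evaluates True on every host, so each task proceeds to `_execute()`.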
10202 1727204058.57314: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 10202 1727204058.57406: in run() - task 127b8e07-fff9-0b04-2570-00000000006e 10202 1727204058.57415: variable 'ansible_search_path' from source: unknown 10202 1727204058.57419: variable 'ansible_search_path' from source: unknown 10202 1727204058.57457: calling self._execute() 10202 1727204058.57540: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204058.57548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204058.57558: variable 'omit' from source: magic vars 10202 1727204058.57877: variable 'ansible_distribution_major_version' from source: facts 10202 1727204058.57888: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204058.57894: _execute() done 10202 1727204058.57897: dumping result to json 10202 1727204058.57902: done dumping result, returning 10202 1727204058.57908: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-0b04-2570-00000000006e] 10202 1727204058.57914: sending task result for task 127b8e07-fff9-0b04-2570-00000000006e 10202 1727204058.58019: done sending task result for task 127b8e07-fff9-0b04-2570-00000000006e 10202 1727204058.58022: WORKER PROCESS EXITING 10202 1727204058.58053: no more pending results, returning what we have 10202 1727204058.58058: in VariableManager get_vars() 10202 1727204058.58107: Calling all_inventory to load vars for managed-node3 10202 1727204058.58111: Calling groups_inventory to load vars for managed-node3 10202 1727204058.58113: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204058.58127: Calling all_plugins_play to load vars for managed-node3 10202 1727204058.58129: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204058.58140: Calling groups_plugins_play to load vars for managed-node3 10202 
1727204058.59137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204058.60285: done with get_vars() 10202 1727204058.60314: variable 'ansible_search_path' from source: unknown 10202 1727204058.60315: variable 'ansible_search_path' from source: unknown 10202 1727204058.60348: we have included files to process 10202 1727204058.60349: generating all_blocks data 10202 1727204058.60351: done generating all_blocks data 10202 1727204058.60354: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204058.60355: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204058.60357: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10202 1727204058.60505: done processing included file 10202 1727204058.60507: iterating over new_blocks loaded from include file 10202 1727204058.60508: in VariableManager get_vars() 10202 1727204058.60525: done with get_vars() 10202 1727204058.60526: filtering new block on tags 10202 1727204058.60540: done filtering new block on tags 10202 1727204058.60541: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 10202 1727204058.60546: extending task lists for all hosts with included blocks 10202 1727204058.60617: done extending task lists 10202 1727204058.60618: done processing included files 10202 1727204058.60618: results queue empty 10202 1727204058.60619: checking for any_errors_fatal 10202 1727204058.60620: done checking for any_errors_fatal 10202 1727204058.60620: checking for max_fail_percentage 10202 1727204058.60621: done checking for 
max_fail_percentage 10202 1727204058.60622: checking to see if all hosts have failed and the running result is not ok 10202 1727204058.60622: done checking to see if all hosts have failed 10202 1727204058.60623: getting the remaining hosts for this loop 10202 1727204058.60624: done getting the remaining hosts for this loop 10202 1727204058.60626: getting the next task for host managed-node3 10202 1727204058.60629: done getting next task for host managed-node3 10202 1727204058.60632: ^ task is: TASK: Get stat for interface {{ interface }} 10202 1727204058.60635: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204058.60637: getting variables 10202 1727204058.60638: in VariableManager get_vars() 10202 1727204058.60648: Calling all_inventory to load vars for managed-node3 10202 1727204058.60650: Calling groups_inventory to load vars for managed-node3 10202 1727204058.60651: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204058.60656: Calling all_plugins_play to load vars for managed-node3 10202 1727204058.60657: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204058.60659: Calling groups_plugins_play to load vars for managed-node3 10202 1727204058.66284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204058.68367: done with get_vars() 10202 1727204058.68407: done getting variables 10202 1727204058.68584: variable 'interface' from source: task vars 10202 1727204058.68588: variable 'controller_device' from source: play vars 10202 1727204058.68656: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.118) 0:00:20.362 ***** 10202 1727204058.68690: entering _queue_task() for managed-node3/stat 10202 1727204058.69085: worker is 1 (out of 1 available) 10202 1727204058.69101: exiting _queue_task() for managed-node3/stat 10202 1727204058.69116: done queuing things up, now waiting for results queue to drain 10202 1727204058.69118: waiting for pending results... 
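The TASK banner just above shows the `{{ interface }}` placeholder in the task name already rendered to `nm-bond`: the log resolves `interface` from task vars, which in turn comes from `controller_device` in play vars. As a toy illustration of that substitution step (Ansible actually uses Jinja2, as noted in the header; this regex stand-in is only a sketch), the rendering amounts to:

```python
import re

def render(template: str, variables: dict) -> str:
    """Replace {{ var }} placeholders with values from `variables`.
    A simplified stand-in for the Jinja2 templating Ansible performs
    when it turns 'Get stat for interface {{ interface }}' into the
    banner 'Get stat for interface nm-bond' seen in the log."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        template,
    )

# Values as resolved in the log: interface <- controller_device <- 'nm-bond'
print(render("Get stat for interface {{ interface }}", {"interface": "nm-bond"}))
```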
10202 1727204058.69654: running TaskExecutor() for managed-node3/TASK: Get stat for interface nm-bond 10202 1727204058.69745: in run() - task 127b8e07-fff9-0b04-2570-000000000241 10202 1727204058.69749: variable 'ansible_search_path' from source: unknown 10202 1727204058.69772: variable 'ansible_search_path' from source: unknown 10202 1727204058.69901: calling self._execute() 10202 1727204058.69955: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204058.69976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204058.70014: variable 'omit' from source: magic vars 10202 1727204058.70670: variable 'ansible_distribution_major_version' from source: facts 10202 1727204058.70675: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204058.70679: variable 'omit' from source: magic vars 10202 1727204058.70707: variable 'omit' from source: magic vars 10202 1727204058.70823: variable 'interface' from source: task vars 10202 1727204058.70827: variable 'controller_device' from source: play vars 10202 1727204058.70907: variable 'controller_device' from source: play vars 10202 1727204058.70928: variable 'omit' from source: magic vars 10202 1727204058.70978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204058.71072: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204058.71075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204058.71078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204058.71080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204058.71106: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 10202 1727204058.71110: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204058.71112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204058.71238: Set connection var ansible_shell_type to sh 10202 1727204058.71247: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204058.71252: Set connection var ansible_connection to ssh 10202 1727204058.71258: Set connection var ansible_shell_executable to /bin/sh 10202 1727204058.71264: Set connection var ansible_pipelining to False 10202 1727204058.71319: Set connection var ansible_timeout to 10 10202 1727204058.71322: variable 'ansible_shell_executable' from source: unknown 10202 1727204058.71325: variable 'ansible_connection' from source: unknown 10202 1727204058.71327: variable 'ansible_module_compression' from source: unknown 10202 1727204058.71329: variable 'ansible_shell_type' from source: unknown 10202 1727204058.71332: variable 'ansible_shell_executable' from source: unknown 10202 1727204058.71335: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204058.71338: variable 'ansible_pipelining' from source: unknown 10202 1727204058.71342: variable 'ansible_timeout' from source: unknown 10202 1727204058.71345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204058.71543: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204058.71840: variable 'omit' from source: magic vars 10202 1727204058.71843: starting attempt loop 10202 1727204058.71846: running the handler 10202 1727204058.71849: _low_level_execute_command(): starting 10202 1727204058.71851: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 
1727204058.72469: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.72538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204058.72564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.72671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204058.75705: stdout chunk (state=3): >>>/root <<< 10202 1727204058.75969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204058.75973: stdout chunk (state=3): >>><<< 10202 1727204058.75977: stderr chunk (state=3): >>><<< 10202 1727204058.76001: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204058.76025: _low_level_execute_command(): starting 10202 1727204058.76038: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017 `" && echo ansible-tmp-1727204058.7601035-11493-204960579278017="` echo /root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017 `" ) && sleep 0' 10202 1727204058.76750: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204058.76777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204058.76796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204058.76817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204058.76843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204058.76949: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204058.76973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.77085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204058.79279: stdout chunk (state=3): >>>ansible-tmp-1727204058.7601035-11493-204960579278017=/root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017 <<< 10202 1727204058.79472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204058.79680: stdout chunk (state=3): >>><<< 10202 1727204058.79684: stderr chunk (state=3): >>><<< 10202 1727204058.79688: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204058.7601035-11493-204960579278017=/root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204058.79691: variable 'ansible_module_compression' from source: unknown 10202 1727204058.79694: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10202 1727204058.79707: variable 'ansible_facts' from source: unknown 10202 1727204058.79792: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/AnsiballZ_stat.py 10202 1727204058.80055: Sending initial data 10202 1727204058.80059: Sent initial data (153 bytes) 10202 1727204058.80680: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204058.80707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204058.80800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204058.80817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.80849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204058.80880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204058.80898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.81020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204058.82811: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204058.82905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204058.82985: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmplrb5rzcr /root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/AnsiballZ_stat.py <<< 10202 1727204058.82988: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/AnsiballZ_stat.py" <<< 10202 1727204058.83042: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmplrb5rzcr" to remote "/root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/AnsiballZ_stat.py" <<< 10202 1727204058.84173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204058.84177: stdout chunk (state=3): >>><<< 10202 1727204058.84180: stderr chunk (state=3): >>><<< 10202 1727204058.84182: done transferring module to remote 10202 1727204058.84184: _low_level_execute_command(): starting 10202 1727204058.84187: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/ /root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/AnsiballZ_stat.py && sleep 0' 10202 1727204058.84788: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.84830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204058.84851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204058.84860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.84958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204058.87088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204058.87095: stderr chunk (state=3): >>><<< 10202 1727204058.87098: stdout chunk (state=3): >>><<< 10202 1727204058.87174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204058.87178: _low_level_execute_command(): starting 10202 1727204058.87181: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/AnsiballZ_stat.py && sleep 0' 10202 1727204058.87764: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204058.87776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204058.87787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204058.87802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204058.87825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204058.87831: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204058.87834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.87920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204058.87923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204058.87936: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204058.87938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204058.87940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204058.87942: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204058.87944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204058.87946: stderr chunk (state=3): >>>debug2: match found <<< 10202 1727204058.87948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204058.87991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204058.88007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204058.88031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204058.88137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.06214: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35990, "dev": 23, "nlink": 1, "atime": 1727204057.2294788, "mtime": 1727204057.2294788, "ctime": 1727204057.2294788, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10202 1727204059.07902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204059.07906: stdout chunk (state=3): >>><<< 10202 1727204059.07908: stderr chunk (state=3): >>><<< 10202 1727204059.08072: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35990, "dev": 23, "nlink": 1, "atime": 1727204057.2294788, "mtime": 1727204057.2294788, "ctime": 1727204057.2294788, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204059.08076: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204059.08084: _low_level_execute_command(): starting 10202 1727204059.08086: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204058.7601035-11493-204960579278017/ > /dev/null 2>&1 && sleep 0' 10202 1727204059.08724: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204059.08752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204059.08775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.08887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.08904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.08930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.09038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.11374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204059.11380: stdout chunk (state=3): >>><<< 10202 1727204059.11383: stderr chunk (state=3): >>><<< 10202 1727204059.11386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204059.11388: handler run complete 10202 1727204059.11391: attempt loop complete, returning result 10202 1727204059.11393: _execute() done 10202 1727204059.11395: dumping result to json 10202 1727204059.11398: done dumping result, returning 10202 1727204059.11400: done running TaskExecutor() for managed-node3/TASK: Get stat for interface nm-bond [127b8e07-fff9-0b04-2570-000000000241] 10202 1727204059.11402: sending task result for task 127b8e07-fff9-0b04-2570-000000000241 ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204057.2294788, "block_size": 4096, "blocks": 0, "ctime": 1727204057.2294788, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35990, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727204057.2294788, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10202 1727204059.11698: no more pending results, returning what we have 10202 1727204059.11702: results queue empty 10202 1727204059.11704: checking for any_errors_fatal 10202 1727204059.11705: done checking for any_errors_fatal 10202 1727204059.11706: checking for max_fail_percentage 10202 
1727204059.11708: done checking for max_fail_percentage 10202 1727204059.11709: checking to see if all hosts have failed and the running result is not ok 10202 1727204059.11710: done checking to see if all hosts have failed 10202 1727204059.11711: getting the remaining hosts for this loop 10202 1727204059.11713: done getting the remaining hosts for this loop 10202 1727204059.11718: getting the next task for host managed-node3 10202 1727204059.11730: done getting next task for host managed-node3 10202 1727204059.11733: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10202 1727204059.11736: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204059.11743: getting variables 10202 1727204059.11745: in VariableManager get_vars() 10202 1727204059.11900: Calling all_inventory to load vars for managed-node3 10202 1727204059.11903: Calling groups_inventory to load vars for managed-node3 10202 1727204059.11906: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.11981: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.11992: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.11999: done sending task result for task 127b8e07-fff9-0b04-2570-000000000241 10202 1727204059.12002: WORKER PROCESS EXITING 10202 1727204059.12007: Calling groups_plugins_play to load vars for managed-node3 10202 1727204059.13993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.16267: done with get_vars() 10202 1727204059.16286: done getting variables 10202 1727204059.16335: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204059.16432: variable 'interface' from source: task vars 10202 1727204059.16435: variable 'controller_device' from source: play vars 10202 1727204059.16481: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.478) 0:00:20.840 ***** 10202 1727204059.16510: entering _queue_task() for managed-node3/assert 10202 1727204059.16785: worker is 1 (out of 1 available) 10202 1727204059.16799: exiting _queue_task() for managed-node3/assert 
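
The `stat` result recorded above (a symlink at `/sys/class/net/nm-bond` resolving to `../../devices/virtual/net/nm-bond`) is what a device-presence check of roughly this shape would produce. This is a sketch inferred from the log, not the actual contents of `assert_device_present.yml`; the `register` name `interface_stat` is taken from the variable the log later evaluates:

```yaml
# Sketch of the presence check implied by the log output above.
# The real task lives in tests/network/playbooks/tasks/assert_device_present.yml.
- name: Get stat for interface nm-bond
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"  # resolves to /sys/class/net/nm-bond here
  register: interface_stat
```
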
10202 1727204059.16812: done queuing things up, now waiting for results queue to drain 10202 1727204059.16814: waiting for pending results... 10202 1727204059.17016: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'nm-bond' 10202 1727204059.17120: in run() - task 127b8e07-fff9-0b04-2570-00000000006f 10202 1727204059.17134: variable 'ansible_search_path' from source: unknown 10202 1727204059.17138: variable 'ansible_search_path' from source: unknown 10202 1727204059.17172: calling self._execute() 10202 1727204059.17249: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.17260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.17266: variable 'omit' from source: magic vars 10202 1727204059.17564: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.17578: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.17584: variable 'omit' from source: magic vars 10202 1727204059.17619: variable 'omit' from source: magic vars 10202 1727204059.17693: variable 'interface' from source: task vars 10202 1727204059.17699: variable 'controller_device' from source: play vars 10202 1727204059.17788: variable 'controller_device' from source: play vars 10202 1727204059.17791: variable 'omit' from source: magic vars 10202 1727204059.17841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204059.17902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204059.17905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204059.17939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204059.17942: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204059.17984: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204059.17988: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.17991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.18278: Set connection var ansible_shell_type to sh 10202 1727204059.18282: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204059.18284: Set connection var ansible_connection to ssh 10202 1727204059.18286: Set connection var ansible_shell_executable to /bin/sh 10202 1727204059.18289: Set connection var ansible_pipelining to False 10202 1727204059.18291: Set connection var ansible_timeout to 10 10202 1727204059.18294: variable 'ansible_shell_executable' from source: unknown 10202 1727204059.18296: variable 'ansible_connection' from source: unknown 10202 1727204059.18298: variable 'ansible_module_compression' from source: unknown 10202 1727204059.18300: variable 'ansible_shell_type' from source: unknown 10202 1727204059.18302: variable 'ansible_shell_executable' from source: unknown 10202 1727204059.18304: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.18306: variable 'ansible_pipelining' from source: unknown 10202 1727204059.18309: variable 'ansible_timeout' from source: unknown 10202 1727204059.18311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.18327: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204059.18342: variable 'omit' from source: magic vars 10202 1727204059.18347: starting 
attempt loop 10202 1727204059.18349: running the handler 10202 1727204059.18509: variable 'interface_stat' from source: set_fact 10202 1727204059.18537: Evaluated conditional (interface_stat.stat.exists): True 10202 1727204059.18543: handler run complete 10202 1727204059.18567: attempt loop complete, returning result 10202 1727204059.18571: _execute() done 10202 1727204059.18575: dumping result to json 10202 1727204059.18581: done dumping result, returning 10202 1727204059.18590: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'nm-bond' [127b8e07-fff9-0b04-2570-00000000006f] 10202 1727204059.18596: sending task result for task 127b8e07-fff9-0b04-2570-00000000006f 10202 1727204059.18694: done sending task result for task 127b8e07-fff9-0b04-2570-00000000006f 10202 1727204059.18697: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204059.18758: no more pending results, returning what we have 10202 1727204059.18762: results queue empty 10202 1727204059.18763: checking for any_errors_fatal 10202 1727204059.18775: done checking for any_errors_fatal 10202 1727204059.18776: checking for max_fail_percentage 10202 1727204059.18777: done checking for max_fail_percentage 10202 1727204059.18779: checking to see if all hosts have failed and the running result is not ok 10202 1727204059.18780: done checking to see if all hosts have failed 10202 1727204059.18780: getting the remaining hosts for this loop 10202 1727204059.18782: done getting the remaining hosts for this loop 10202 1727204059.18786: getting the next task for host managed-node3 10202 1727204059.18794: done getting next task for host managed-node3 10202 1727204059.18799: ^ task is: TASK: Include the task 'assert_profile_present.yml' 10202 1727204059.18801: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204059.18806: getting variables 10202 1727204059.18807: in VariableManager get_vars() 10202 1727204059.18855: Calling all_inventory to load vars for managed-node3 10202 1727204059.18858: Calling groups_inventory to load vars for managed-node3 10202 1727204059.18860: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.18916: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.18920: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.18924: Calling groups_plugins_play to load vars for managed-node3 10202 1727204059.20482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.21675: done with get_vars() 10202 1727204059.21702: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.052) 0:00:20.893 ***** 10202 1727204059.21781: entering _queue_task() for managed-node3/include_tasks 10202 1727204059.22073: worker is 1 (out of 1 available) 10202 1727204059.22089: exiting _queue_task() for managed-node3/include_tasks 10202 1727204059.22102: done queuing things up, now waiting for results queue to drain 10202 1727204059.22104: waiting for pending results... 
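
The conditional the log shows evaluating to True (`interface_stat.stat.exists`) suggests the assert task looks roughly like the following. The task name and file path come from the log (`assert_device_present.yml:5`); the body is an assumption consistent with the evaluated conditional:

```yaml
# Sketch of the assertion evaluated in the log; not the verbatim source file.
- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists  # the log shows this conditional evaluating to True
```
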
10202 1727204059.22366: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' 10202 1727204059.22521: in run() - task 127b8e07-fff9-0b04-2570-000000000070 10202 1727204059.22530: variable 'ansible_search_path' from source: unknown 10202 1727204059.22792: variable 'controller_profile' from source: play vars 10202 1727204059.22797: variable 'controller_profile' from source: play vars 10202 1727204059.22800: variable 'port1_profile' from source: play vars 10202 1727204059.22863: variable 'port1_profile' from source: play vars 10202 1727204059.22871: variable 'port2_profile' from source: play vars 10202 1727204059.22934: variable 'port2_profile' from source: play vars 10202 1727204059.22949: variable 'omit' from source: magic vars 10202 1727204059.23093: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.23106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.23117: variable 'omit' from source: magic vars 10202 1727204059.23386: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.23397: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.23486: variable 'item' from source: unknown 10202 1727204059.23497: variable 'item' from source: unknown 10202 1727204059.23802: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.23806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.23809: variable 'omit' from source: magic vars 10202 1727204059.23812: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.23814: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.23817: variable 'item' from source: unknown 10202 1727204059.23861: variable 'item' from source: unknown 10202 1727204059.23948: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 
1727204059.23951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.23956: variable 'omit' from source: magic vars 10202 1727204059.24065: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.24069: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.24106: variable 'item' from source: unknown 10202 1727204059.24145: variable 'item' from source: unknown 10202 1727204059.24215: dumping result to json 10202 1727204059.24220: done dumping result, returning 10202 1727204059.24224: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' [127b8e07-fff9-0b04-2570-000000000070] 10202 1727204059.24226: sending task result for task 127b8e07-fff9-0b04-2570-000000000070 10202 1727204059.24269: done sending task result for task 127b8e07-fff9-0b04-2570-000000000070 10202 1727204059.24272: WORKER PROCESS EXITING 10202 1727204059.24300: no more pending results, returning what we have 10202 1727204059.24305: in VariableManager get_vars() 10202 1727204059.24354: Calling all_inventory to load vars for managed-node3 10202 1727204059.24357: Calling groups_inventory to load vars for managed-node3 10202 1727204059.24359: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.24381: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.24384: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.24388: Calling groups_plugins_play to load vars for managed-node3 10202 1727204059.25534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.26696: done with get_vars() 10202 1727204059.26718: variable 'ansible_search_path' from source: unknown 10202 1727204059.26736: variable 'ansible_search_path' from source: unknown 10202 1727204059.26743: variable 'ansible_search_path' from source: unknown 10202 
1727204059.26748: we have included files to process 10202 1727204059.26748: generating all_blocks data 10202 1727204059.26749: done generating all_blocks data 10202 1727204059.26753: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.26753: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.26755: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.26908: in VariableManager get_vars() 10202 1727204059.26926: done with get_vars() 10202 1727204059.27123: done processing included file 10202 1727204059.27125: iterating over new_blocks loaded from include file 10202 1727204059.27126: in VariableManager get_vars() 10202 1727204059.27142: done with get_vars() 10202 1727204059.27144: filtering new block on tags 10202 1727204059.27158: done filtering new block on tags 10202 1727204059.27160: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0) 10202 1727204059.27164: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.27164: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.27169: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.27240: in VariableManager get_vars() 10202 1727204059.27255: done with get_vars() 10202 1727204059.27420: done 
processing included file 10202 1727204059.27421: iterating over new_blocks loaded from include file 10202 1727204059.27422: in VariableManager get_vars() 10202 1727204059.27437: done with get_vars() 10202 1727204059.27438: filtering new block on tags 10202 1727204059.27451: done filtering new block on tags 10202 1727204059.27452: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0.0) 10202 1727204059.27455: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.27456: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.27458: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10202 1727204059.27593: in VariableManager get_vars() 10202 1727204059.27608: done with get_vars() 10202 1727204059.27780: done processing included file 10202 1727204059.27782: iterating over new_blocks loaded from include file 10202 1727204059.27783: in VariableManager get_vars() 10202 1727204059.27794: done with get_vars() 10202 1727204059.27795: filtering new block on tags 10202 1727204059.27808: done filtering new block on tags 10202 1727204059.27809: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0.1) 10202 1727204059.27812: extending task lists for all hosts with included blocks 10202 1727204059.29557: done extending task lists 10202 1727204059.29565: done processing included files 10202 1727204059.29567: results queue empty 10202 
1727204059.29567: checking for any_errors_fatal 10202 1727204059.29570: done checking for any_errors_fatal 10202 1727204059.29571: checking for max_fail_percentage 10202 1727204059.29572: done checking for max_fail_percentage 10202 1727204059.29572: checking to see if all hosts have failed and the running result is not ok 10202 1727204059.29573: done checking to see if all hosts have failed 10202 1727204059.29573: getting the remaining hosts for this loop 10202 1727204059.29574: done getting the remaining hosts for this loop 10202 1727204059.29576: getting the next task for host managed-node3 10202 1727204059.29579: done getting next task for host managed-node3 10202 1727204059.29581: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10202 1727204059.29583: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204059.29585: getting variables 10202 1727204059.29586: in VariableManager get_vars() 10202 1727204059.29601: Calling all_inventory to load vars for managed-node3 10202 1727204059.29603: Calling groups_inventory to load vars for managed-node3 10202 1727204059.29604: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.29611: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.29613: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.29615: Calling groups_plugins_play to load vars for managed-node3 10202 1727204059.30542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.31713: done with get_vars() 10202 1727204059.31741: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.100) 0:00:20.993 ***** 10202 1727204059.31809: entering _queue_task() for managed-node3/include_tasks 10202 1727204059.32102: worker is 1 (out of 1 available) 10202 1727204059.32116: exiting _queue_task() for managed-node3/include_tasks 10202 1727204059.32133: done queuing things up, now waiting for results queue to drain 10202 1727204059.32135: waiting for pending results... 
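
The three include passes above (items `bond0`, `bond0.0`, `bond0.1`, drawn from `controller_profile`, `port1_profile`, and `port2_profile`) correspond to an `include_tasks` loop of roughly this shape. The loop structure and variable names are read from the log; the `profile` loop variable is a guess at how the item is passed down:

```yaml
# Sketch of the include loop implied by the three "(item=...)" lines above.
- name: Include the task 'assert_profile_present.yml'
  ansible.builtin.include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"  # assumed variable name, not confirmed by the log
  loop:
    - "{{ controller_profile }}"  # bond0
    - "{{ port1_profile }}"       # bond0.0
    - "{{ port2_profile }}"       # bond0.1
```
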
10202 1727204059.32319: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 10202 1727204059.32391: in run() - task 127b8e07-fff9-0b04-2570-00000000025f 10202 1727204059.32404: variable 'ansible_search_path' from source: unknown 10202 1727204059.32407: variable 'ansible_search_path' from source: unknown 10202 1727204059.32439: calling self._execute() 10202 1727204059.32521: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.32530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.32537: variable 'omit' from source: magic vars 10202 1727204059.32860: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.32872: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.32879: _execute() done 10202 1727204059.32882: dumping result to json 10202 1727204059.32885: done dumping result, returning 10202 1727204059.32892: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-0b04-2570-00000000025f] 10202 1727204059.32897: sending task result for task 127b8e07-fff9-0b04-2570-00000000025f 10202 1727204059.32996: done sending task result for task 127b8e07-fff9-0b04-2570-00000000025f 10202 1727204059.32999: WORKER PROCESS EXITING 10202 1727204059.33040: no more pending results, returning what we have 10202 1727204059.33045: in VariableManager get_vars() 10202 1727204059.33094: Calling all_inventory to load vars for managed-node3 10202 1727204059.33097: Calling groups_inventory to load vars for managed-node3 10202 1727204059.33099: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.33115: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.33118: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.33121: Calling groups_plugins_play to load vars for managed-node3 10202 
1727204059.34117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.35303: done with get_vars() 10202 1727204059.35326: variable 'ansible_search_path' from source: unknown 10202 1727204059.35327: variable 'ansible_search_path' from source: unknown 10202 1727204059.35363: we have included files to process 10202 1727204059.35364: generating all_blocks data 10202 1727204059.35367: done generating all_blocks data 10202 1727204059.35368: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10202 1727204059.35369: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10202 1727204059.35371: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10202 1727204059.36123: done processing included file 10202 1727204059.36126: iterating over new_blocks loaded from include file 10202 1727204059.36127: in VariableManager get_vars() 10202 1727204059.36145: done with get_vars() 10202 1727204059.36146: filtering new block on tags 10202 1727204059.36162: done filtering new block on tags 10202 1727204059.36164: in VariableManager get_vars() 10202 1727204059.36178: done with get_vars() 10202 1727204059.36179: filtering new block on tags 10202 1727204059.36192: done filtering new block on tags 10202 1727204059.36193: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 10202 1727204059.36198: extending task lists for all hosts with included blocks 10202 1727204059.36319: done extending task lists 10202 1727204059.36320: done processing included files 10202 1727204059.36320: results queue empty 10202 
1727204059.36321: checking for any_errors_fatal 10202 1727204059.36324: done checking for any_errors_fatal 10202 1727204059.36325: checking for max_fail_percentage 10202 1727204059.36326: done checking for max_fail_percentage 10202 1727204059.36326: checking to see if all hosts have failed and the running result is not ok 10202 1727204059.36329: done checking to see if all hosts have failed 10202 1727204059.36329: getting the remaining hosts for this loop 10202 1727204059.36330: done getting the remaining hosts for this loop 10202 1727204059.36332: getting the next task for host managed-node3 10202 1727204059.36335: done getting next task for host managed-node3 10202 1727204059.36337: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10202 1727204059.36339: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204059.36341: getting variables 10202 1727204059.36341: in VariableManager get_vars() 10202 1727204059.36491: Calling all_inventory to load vars for managed-node3 10202 1727204059.36493: Calling groups_inventory to load vars for managed-node3 10202 1727204059.36495: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.36500: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.36501: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.36503: Calling groups_plugins_play to load vars for managed-node3 10202 1727204059.37311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.38472: done with get_vars() 10202 1727204059.38502: done getting variables 10202 1727204059.38543: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.067) 0:00:21.061 ***** 10202 1727204059.38570: entering _queue_task() for managed-node3/set_fact 10202 1727204059.38862: worker is 1 (out of 1 available) 10202 1727204059.38879: exiting _queue_task() for managed-node3/set_fact 10202 1727204059.38894: done queuing things up, now waiting for results queue to drain 10202 1727204059.38896: waiting for pending results... 
10202 1727204059.39093: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 10202 1727204059.39173: in run() - task 127b8e07-fff9-0b04-2570-0000000003b0 10202 1727204059.39186: variable 'ansible_search_path' from source: unknown 10202 1727204059.39189: variable 'ansible_search_path' from source: unknown 10202 1727204059.39227: calling self._execute() 10202 1727204059.39306: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.39310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.39321: variable 'omit' from source: magic vars 10202 1727204059.39635: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.39649: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.39653: variable 'omit' from source: magic vars 10202 1727204059.39698: variable 'omit' from source: magic vars 10202 1727204059.39725: variable 'omit' from source: magic vars 10202 1727204059.39768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204059.39800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204059.39816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204059.39833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204059.39843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204059.39873: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204059.39877: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.39880: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 10202 1727204059.39954: Set connection var ansible_shell_type to sh 10202 1727204059.39957: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204059.39964: Set connection var ansible_connection to ssh 10202 1727204059.39972: Set connection var ansible_shell_executable to /bin/sh 10202 1727204059.39979: Set connection var ansible_pipelining to False 10202 1727204059.39982: Set connection var ansible_timeout to 10 10202 1727204059.40006: variable 'ansible_shell_executable' from source: unknown 10202 1727204059.40010: variable 'ansible_connection' from source: unknown 10202 1727204059.40013: variable 'ansible_module_compression' from source: unknown 10202 1727204059.40016: variable 'ansible_shell_type' from source: unknown 10202 1727204059.40019: variable 'ansible_shell_executable' from source: unknown 10202 1727204059.40021: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.40026: variable 'ansible_pipelining' from source: unknown 10202 1727204059.40032: variable 'ansible_timeout' from source: unknown 10202 1727204059.40034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.40148: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204059.40157: variable 'omit' from source: magic vars 10202 1727204059.40163: starting attempt loop 10202 1727204059.40167: running the handler 10202 1727204059.40179: handler run complete 10202 1727204059.40188: attempt loop complete, returning result 10202 1727204059.40192: _execute() done 10202 1727204059.40196: dumping result to json 10202 1727204059.40199: done dumping result, returning 10202 1727204059.40207: done running TaskExecutor() for 
managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-0b04-2570-0000000003b0] 10202 1727204059.40211: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b0 10202 1727204059.40300: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b0 10202 1727204059.40302: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10202 1727204059.40378: no more pending results, returning what we have 10202 1727204059.40382: results queue empty 10202 1727204059.40383: checking for any_errors_fatal 10202 1727204059.40385: done checking for any_errors_fatal 10202 1727204059.40385: checking for max_fail_percentage 10202 1727204059.40387: done checking for max_fail_percentage 10202 1727204059.40388: checking to see if all hosts have failed and the running result is not ok 10202 1727204059.40389: done checking to see if all hosts have failed 10202 1727204059.40390: getting the remaining hosts for this loop 10202 1727204059.40392: done getting the remaining hosts for this loop 10202 1727204059.40396: getting the next task for host managed-node3 10202 1727204059.40403: done getting next task for host managed-node3 10202 1727204059.40406: ^ task is: TASK: Stat profile file 10202 1727204059.40411: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204059.40414: getting variables 10202 1727204059.40415: in VariableManager get_vars() 10202 1727204059.40464: Calling all_inventory to load vars for managed-node3 10202 1727204059.40473: Calling groups_inventory to load vars for managed-node3 10202 1727204059.40476: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.40486: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.40489: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.40491: Calling groups_plugins_play to load vars for managed-node3 10202 1727204059.41576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.42738: done with get_vars() 10202 1727204059.42762: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.042) 0:00:21.103 ***** 10202 1727204059.42844: entering _queue_task() for managed-node3/stat 10202 1727204059.43133: worker is 1 (out of 1 available) 10202 1727204059.43147: exiting _queue_task() for managed-node3/stat 10202 1727204059.43160: done queuing things up, now waiting for results queue to drain 10202 1727204059.43162: waiting for pending results... 
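The queued "Stat profile file" task runs Ansible's `stat` module on the target with checksum, MIME, and attribute collection disabled, so for a missing path the module returns only `{"changed": false, "stat": {"exists": false}}`, as the task result later in this log shows. A minimal Python sketch of that exists-only result shape (the helper name and path are illustrative, not Ansible internals):

```python
import os

def stat_exists_result(path):
    """Mimic the shape of Ansible's stat module result for an exists-only
    check (get_checksum/get_mime/get_attributes all False).
    Illustrative helper; not part of Ansible's API."""
    # lexists() reports broken symlinks as present, matching stat's behavior
    # of examining the link itself when follow=False.
    return {"changed": False, "stat": {"exists": os.path.lexists(path)}}

# For a path that does not exist, the result matches the one recorded
# in this log for /etc/sysconfig/network-scripts/ifcfg-bond0.
result = stat_exists_result("/no/such/path/ifcfg-example")
```

When the file does exist, the real module adds many more keys under `"stat"` (size, mode, checksums, etc.); the exists-only form above is all that the skipped-task conditional `profile_stat.stat.exists` later in this log depends on.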
10202 1727204059.43352: running TaskExecutor() for managed-node3/TASK: Stat profile file 10202 1727204059.43433: in run() - task 127b8e07-fff9-0b04-2570-0000000003b1 10202 1727204059.43445: variable 'ansible_search_path' from source: unknown 10202 1727204059.43448: variable 'ansible_search_path' from source: unknown 10202 1727204059.43482: calling self._execute() 10202 1727204059.43561: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.43568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.43577: variable 'omit' from source: magic vars 10202 1727204059.43889: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.43900: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.43907: variable 'omit' from source: magic vars 10202 1727204059.43948: variable 'omit' from source: magic vars 10202 1727204059.44025: variable 'profile' from source: include params 10202 1727204059.44031: variable 'item' from source: include params 10202 1727204059.44082: variable 'item' from source: include params 10202 1727204059.44098: variable 'omit' from source: magic vars 10202 1727204059.44136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204059.44171: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204059.44188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204059.44202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204059.44213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204059.44239: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 
1727204059.44242: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.44245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.44321: Set connection var ansible_shell_type to sh 10202 1727204059.44329: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204059.44332: Set connection var ansible_connection to ssh 10202 1727204059.44338: Set connection var ansible_shell_executable to /bin/sh 10202 1727204059.44343: Set connection var ansible_pipelining to False 10202 1727204059.44349: Set connection var ansible_timeout to 10 10202 1727204059.44373: variable 'ansible_shell_executable' from source: unknown 10202 1727204059.44383: variable 'ansible_connection' from source: unknown 10202 1727204059.44387: variable 'ansible_module_compression' from source: unknown 10202 1727204059.44389: variable 'ansible_shell_type' from source: unknown 10202 1727204059.44392: variable 'ansible_shell_executable' from source: unknown 10202 1727204059.44394: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.44396: variable 'ansible_pipelining' from source: unknown 10202 1727204059.44399: variable 'ansible_timeout' from source: unknown 10202 1727204059.44401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.44569: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204059.44579: variable 'omit' from source: magic vars 10202 1727204059.44586: starting attempt loop 10202 1727204059.44589: running the handler 10202 1727204059.44604: _low_level_execute_command(): starting 10202 1727204059.44610: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204059.45159: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.45164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.45171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.45223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.45229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.45234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.45311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.47188: stdout chunk (state=3): >>>/root <<< 10202 1727204059.47350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204059.47355: stderr chunk (state=3): >>><<< 10202 1727204059.47358: stdout chunk (state=3): >>><<< 10202 1727204059.47381: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204059.47394: _low_level_execute_command(): starting 10202 1727204059.47399: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684 `" && echo ansible-tmp-1727204059.4738054-11518-16163863193684="` echo /root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684 `" ) && sleep 0' 10202 1727204059.47904: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.47908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.47918: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.47920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.47974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.47982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.47984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.48060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.50239: stdout chunk (state=3): >>>ansible-tmp-1727204059.4738054-11518-16163863193684=/root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684 <<< 10202 1727204059.50351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204059.50421: stderr chunk (state=3): >>><<< 10202 1727204059.50425: stdout chunk (state=3): >>><<< 10202 1727204059.50441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204059.4738054-11518-16163863193684=/root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204059.50489: variable 'ansible_module_compression' from source: unknown 10202 1727204059.50539: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10202 1727204059.50580: variable 'ansible_facts' from source: unknown 10202 1727204059.50634: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/AnsiballZ_stat.py 10202 1727204059.50748: Sending initial data 10202 1727204059.50751: Sent initial data (152 bytes) 10202 1727204059.51247: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.51251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.51253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204059.51258: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.51311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.51314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.51317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.51395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.53194: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204059.53254: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204059.53326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpfmp1rrgj /root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/AnsiballZ_stat.py <<< 10202 1727204059.53331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/AnsiballZ_stat.py" <<< 10202 1727204059.53396: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpfmp1rrgj" to remote "/root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/AnsiballZ_stat.py" <<< 10202 1727204059.53398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/AnsiballZ_stat.py" <<< 10202 1727204059.54329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204059.54374: stderr chunk (state=3): >>><<< 10202 1727204059.54394: stdout chunk (state=3): >>><<< 10202 1727204059.54475: done transferring module to remote 10202 1727204059.54479: _low_level_execute_command(): starting 10202 1727204059.54482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/ /root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/AnsiballZ_stat.py && sleep 0' 10202 1727204059.55250: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204059.55254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.55456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.55592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.57630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204059.57688: stderr chunk (state=3): >>><<< 10202 1727204059.57691: stdout chunk (state=3): >>><<< 10202 1727204059.57707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204059.57710: _low_level_execute_command(): starting 10202 1727204059.57716: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/AnsiballZ_stat.py && sleep 0' 10202 1727204059.58204: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.58208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.58213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204059.58216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.58272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.58276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.58287: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 10202 1727204059.58364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.75905: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10202 1727204059.77437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204059.77498: stderr chunk (state=3): >>><<< 10202 1727204059.77502: stdout chunk (state=3): >>><<< 10202 1727204059.77522: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204059.77553: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204059.77563: _low_level_execute_command(): starting 10202 1727204059.77569: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204059.4738054-11518-16163863193684/ > /dev/null 2>&1 && sleep 0' 10202 1727204059.78074: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.78078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.78081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.78083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.78142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.78145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.78152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.78224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.80299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204059.80359: stderr chunk (state=3): >>><<< 10202 1727204059.80362: stdout chunk (state=3): >>><<< 10202 1727204059.80378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204059.80385: handler run complete 10202 1727204059.80404: attempt loop complete, returning result 10202 1727204059.80407: _execute() done 10202 1727204059.80411: dumping result to json 10202 1727204059.80416: done dumping result, returning 10202 1727204059.80424: done running TaskExecutor() for managed-node3/TASK: Stat profile file [127b8e07-fff9-0b04-2570-0000000003b1] 10202 1727204059.80432: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b1 10202 1727204059.80541: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b1 10202 1727204059.80544: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 10202 1727204059.80606: no more pending results, returning what we have 10202 1727204059.80610: results queue empty 10202 1727204059.80611: checking for any_errors_fatal 10202 1727204059.80616: done checking for any_errors_fatal 10202 1727204059.80617: checking for max_fail_percentage 10202 1727204059.80618: done checking for max_fail_percentage 10202 1727204059.80620: checking to see if all hosts have failed and the running result is not ok 10202 1727204059.80621: done checking to see if all hosts have failed 10202 1727204059.80621: getting the remaining hosts for this loop 10202 1727204059.80623: done getting the remaining hosts for this loop 10202 1727204059.80630: getting the next task for host managed-node3 10202 1727204059.80637: done getting next task for host managed-node3 10202 1727204059.80640: ^ task is: TASK: Set NM profile exist flag based on the profile files 10202 1727204059.80645: ^ state is: HOST 
STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204059.80649: getting variables 10202 1727204059.80651: in VariableManager get_vars() 10202 1727204059.80701: Calling all_inventory to load vars for managed-node3 10202 1727204059.80704: Calling groups_inventory to load vars for managed-node3 10202 1727204059.80706: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.80718: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.80721: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.80723: Calling groups_plugins_play to load vars for managed-node3 10202 1727204059.81733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.83006: done with get_vars() 10202 1727204059.83027: done getting variables 10202 1727204059.83084: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM 
profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.402) 0:00:21.506 ***** 10202 1727204059.83111: entering _queue_task() for managed-node3/set_fact 10202 1727204059.83398: worker is 1 (out of 1 available) 10202 1727204059.83415: exiting _queue_task() for managed-node3/set_fact 10202 1727204059.83428: done queuing things up, now waiting for results queue to drain 10202 1727204059.83429: waiting for pending results... 10202 1727204059.83628: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 10202 1727204059.83715: in run() - task 127b8e07-fff9-0b04-2570-0000000003b2 10202 1727204059.83728: variable 'ansible_search_path' from source: unknown 10202 1727204059.83732: variable 'ansible_search_path' from source: unknown 10202 1727204059.83776: calling self._execute() 10202 1727204059.83851: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.83858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.83868: variable 'omit' from source: magic vars 10202 1727204059.84185: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.84197: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.84293: variable 'profile_stat' from source: set_fact 10202 1727204059.84308: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204059.84311: when evaluation is False, skipping this task 10202 1727204059.84314: _execute() done 10202 1727204059.84317: dumping result to json 10202 1727204059.84321: done dumping result, returning 10202 1727204059.84332: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-0b04-2570-0000000003b2] 10202 
1727204059.84335: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b2 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204059.84490: no more pending results, returning what we have 10202 1727204059.84494: results queue empty 10202 1727204059.84495: checking for any_errors_fatal 10202 1727204059.84507: done checking for any_errors_fatal 10202 1727204059.84508: checking for max_fail_percentage 10202 1727204059.84509: done checking for max_fail_percentage 10202 1727204059.84510: checking to see if all hosts have failed and the running result is not ok 10202 1727204059.84511: done checking to see if all hosts have failed 10202 1727204059.84512: getting the remaining hosts for this loop 10202 1727204059.84514: done getting the remaining hosts for this loop 10202 1727204059.84518: getting the next task for host managed-node3 10202 1727204059.84525: done getting next task for host managed-node3 10202 1727204059.84528: ^ task is: TASK: Get NM profile info 10202 1727204059.84533: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204059.84537: getting variables 10202 1727204059.84539: in VariableManager get_vars() 10202 1727204059.84592: Calling all_inventory to load vars for managed-node3 10202 1727204059.84595: Calling groups_inventory to load vars for managed-node3 10202 1727204059.84597: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204059.84603: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b2 10202 1727204059.84606: WORKER PROCESS EXITING 10202 1727204059.84617: Calling all_plugins_play to load vars for managed-node3 10202 1727204059.84620: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204059.84623: Calling groups_plugins_play to load vars for managed-node3 10202 1727204059.85615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204059.86764: done with get_vars() 10202 1727204059.86792: done getting variables 10202 1727204059.86843: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.037) 0:00:21.544 ***** 10202 1727204059.86875: entering _queue_task() for managed-node3/shell 10202 1727204059.87159: worker is 1 (out of 1 available) 10202 1727204059.87177: exiting _queue_task() for managed-node3/shell 10202 1727204059.87190: done queuing things up, now waiting for results queue to drain 10202 1727204059.87191: waiting for pending results... 
10202 1727204059.87388: running TaskExecutor() for managed-node3/TASK: Get NM profile info 10202 1727204059.87475: in run() - task 127b8e07-fff9-0b04-2570-0000000003b3 10202 1727204059.87488: variable 'ansible_search_path' from source: unknown 10202 1727204059.87491: variable 'ansible_search_path' from source: unknown 10202 1727204059.87525: calling self._execute() 10202 1727204059.87608: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.87613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.87623: variable 'omit' from source: magic vars 10202 1727204059.87931: variable 'ansible_distribution_major_version' from source: facts 10202 1727204059.87944: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204059.87951: variable 'omit' from source: magic vars 10202 1727204059.87991: variable 'omit' from source: magic vars 10202 1727204059.88074: variable 'profile' from source: include params 10202 1727204059.88080: variable 'item' from source: include params 10202 1727204059.88129: variable 'item' from source: include params 10202 1727204059.88146: variable 'omit' from source: magic vars 10202 1727204059.88187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204059.88221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204059.88240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204059.88255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204059.88267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204059.88295: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 
1727204059.88299: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.88303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.88380: Set connection var ansible_shell_type to sh 10202 1727204059.88386: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204059.88392: Set connection var ansible_connection to ssh 10202 1727204059.88399: Set connection var ansible_shell_executable to /bin/sh 10202 1727204059.88403: Set connection var ansible_pipelining to False 10202 1727204059.88409: Set connection var ansible_timeout to 10 10202 1727204059.88436: variable 'ansible_shell_executable' from source: unknown 10202 1727204059.88440: variable 'ansible_connection' from source: unknown 10202 1727204059.88443: variable 'ansible_module_compression' from source: unknown 10202 1727204059.88445: variable 'ansible_shell_type' from source: unknown 10202 1727204059.88448: variable 'ansible_shell_executable' from source: unknown 10202 1727204059.88450: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204059.88453: variable 'ansible_pipelining' from source: unknown 10202 1727204059.88456: variable 'ansible_timeout' from source: unknown 10202 1727204059.88461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204059.88580: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204059.88590: variable 'omit' from source: magic vars 10202 1727204059.88595: starting attempt loop 10202 1727204059.88598: running the handler 10202 1727204059.88607: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204059.88627: _low_level_execute_command(): starting 10202 1727204059.88637: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204059.89214: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.89219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.89224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.89273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.89290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.89301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.89368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.91222: stdout chunk (state=3): >>>/root <<< 10202 1727204059.91322: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 10202 1727204059.91389: stderr chunk (state=3): >>><<< 10202 1727204059.91394: stdout chunk (state=3): >>><<< 10202 1727204059.91416: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204059.91435: _low_level_execute_command(): starting 10202 1727204059.91444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559 `" && echo ansible-tmp-1727204059.9141598-11534-145874973824559="` echo /root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559 `" ) && sleep 0' 10202 1727204059.91953: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 10202 1727204059.91967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.91972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204059.91974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.92023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.92026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.92032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.92108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.94268: stdout chunk (state=3): >>>ansible-tmp-1727204059.9141598-11534-145874973824559=/root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559 <<< 10202 1727204059.94380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204059.94445: stderr chunk (state=3): >>><<< 10202 1727204059.94450: stdout chunk (state=3): >>><<< 10202 1727204059.94467: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204059.9141598-11534-145874973824559=/root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559 , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204059.94501: variable 'ansible_module_compression' from source: unknown 10202 1727204059.94544: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204059.94582: variable 'ansible_facts' from source: unknown 10202 1727204059.94632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/AnsiballZ_command.py 10202 1727204059.94748: Sending initial data 10202 1727204059.94751: Sent initial data (156 bytes) 10202 1727204059.95258: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204059.95261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.95264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.95270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.95324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.95332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.95335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.95403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204059.97207: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204059.97276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10202 1727204059.97339: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpreua1d2k /root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/AnsiballZ_command.py <<< 10202 1727204059.97346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/AnsiballZ_command.py" <<< 10202 1727204059.97407: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpreua1d2k" to remote "/root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/AnsiballZ_command.py" <<< 10202 1727204059.97411: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/AnsiballZ_command.py" <<< 10202 1727204059.98072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204059.98149: stderr chunk (state=3): >>><<< 10202 1727204059.98153: stdout chunk (state=3): >>><<< 10202 1727204059.98177: done transferring module to remote 10202 1727204059.98189: _low_level_execute_command(): starting 10202 1727204059.98193: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/ /root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/AnsiballZ_command.py && sleep 0' 10202 1727204059.98672: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204059.98679: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.98703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204059.98751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204059.98754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204059.98756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204059.98834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204060.06013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204060.06075: stderr chunk (state=3): >>><<< 10202 1727204060.06080: stdout chunk (state=3): >>><<< 10202 1727204060.06094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204060.06097: _low_level_execute_command(): starting 10202 1727204060.06102: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/AnsiballZ_command.py && sleep 0' 10202 1727204060.06595: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204060.06599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204060.06601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204060.06604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204060.06655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204060.06659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204060.06661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204060.06743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204060.27356: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:20.248226", "end": "2024-09-24 14:54:20.272199", "delta": "0:00:00.023973", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204060.29131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204060.29197: stderr chunk (state=3): >>><<< 10202 1727204060.29201: stdout chunk (state=3): >>><<< 10202 1727204060.29217: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:20.248226", "end": "2024-09-24 14:54:20.272199", "delta": "0:00:00.023973", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204060.29252: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204060.29263: _low_level_execute_command(): starting 10202 1727204060.29266: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204059.9141598-11534-145874973824559/ > /dev/null 2>&1 && sleep 0' 10202 1727204060.29771: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204060.29775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204060.29777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204060.29780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204060.29782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204060.29844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204060.29847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204060.29849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204060.29913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204060.32009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204060.32066: stderr chunk (state=3): >>><<< 10202 1727204060.32070: stdout chunk (state=3): >>><<< 10202 1727204060.32085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
10202 1727204060.32095: handler run complete
10202 1727204060.32116: Evaluated conditional (False): False
10202 1727204060.32125: attempt loop complete, returning result
10202 1727204060.32131: _execute() done
10202 1727204060.32134: dumping result to json
10202 1727204060.32136: done dumping result, returning
10202 1727204060.32147: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [127b8e07-fff9-0b04-2570-0000000003b3]
10202 1727204060.32152: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b3
10202 1727204060.32264: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b3
10202 1727204060.32269: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc",
    "delta": "0:00:00.023973",
    "end": "2024-09-24 14:54:20.272199",
    "rc": 0,
    "start": "2024-09-24 14:54:20.248226"
}

STDOUT:

bond0 /etc/NetworkManager/system-connections/bond0.nmconnection
bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection
bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection

10202 1727204060.32352: no more pending results, returning what we have
10202 1727204060.32355: results queue empty
10202 1727204060.32356: checking for any_errors_fatal
10202 1727204060.32363: done checking for any_errors_fatal
10202 1727204060.32364: checking for max_fail_percentage
10202 1727204060.32367: done checking for max_fail_percentage
10202 1727204060.32368: checking to see if all hosts have failed and the running result is not ok
10202 1727204060.32369: done checking to
see if all hosts have failed 10202 1727204060.32370: getting the remaining hosts for this loop 10202 1727204060.32372: done getting the remaining hosts for this loop 10202 1727204060.32376: getting the next task for host managed-node3 10202 1727204060.32389: done getting next task for host managed-node3 10202 1727204060.32392: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10202 1727204060.32396: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
10202 1727204060.32399: getting variables
10202 1727204060.32401: in VariableManager get_vars()
10202 1727204060.32442: Calling all_inventory to load vars for managed-node3
10202 1727204060.32445: Calling groups_inventory to load vars for managed-node3
10202 1727204060.32447: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204060.32458: Calling all_plugins_play to load vars for managed-node3
10202 1727204060.32461: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204060.32463: Calling groups_plugins_play to load vars for managed-node3
10202 1727204060.33575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204060.34738: done with get_vars()
10202 1727204060.34768: done getting variables
10202 1727204060.34820: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.479) 0:00:22.023 *****

10202 1727204060.34847: entering _queue_task() for managed-node3/set_fact
10202 1727204060.35144: worker is 1 (out of 1 available)
10202 1727204060.35159: exiting _queue_task() for managed-node3/set_fact
10202 1727204060.35174: done queuing things up, now waiting for results queue to drain
10202 1727204060.35176: waiting for pending results...
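From the recorded module arguments (`_raw_params`, `_uses_shell`) and the result dict above, the "Get NM profile info" task in get_profile_stat.yml can be reconstructed roughly as follows. This is a sketch, not the collection's exact source: the register name `nm_profile_exists` is inferred from the conditional `nm_profile_exists.rc == 0` evaluated later in the log.

```yaml
# Sketch reconstructed from the log of the "Get NM profile info" task.
# The pipeline comes verbatim from the recorded _raw_params; "bond0" is
# the rendered value of {{ profile }} for this test run.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc
  register: nm_profile_exists  # name inferred from the later conditional
```

The result above (rc=0, three `/etc/NetworkManager/system-connections/*.nmconnection` lines on stdout) is what makes the follow-up flag-setting task run instead of being skipped.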
10202 1727204060.35358: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10202 1727204060.35448: in run() - task 127b8e07-fff9-0b04-2570-0000000003b4 10202 1727204060.35461: variable 'ansible_search_path' from source: unknown 10202 1727204060.35464: variable 'ansible_search_path' from source: unknown 10202 1727204060.35498: calling self._execute() 10202 1727204060.35576: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.35582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.35592: variable 'omit' from source: magic vars 10202 1727204060.35902: variable 'ansible_distribution_major_version' from source: facts 10202 1727204060.35913: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204060.36014: variable 'nm_profile_exists' from source: set_fact 10202 1727204060.36032: Evaluated conditional (nm_profile_exists.rc == 0): True 10202 1727204060.36035: variable 'omit' from source: magic vars 10202 1727204060.36078: variable 'omit' from source: magic vars 10202 1727204060.36109: variable 'omit' from source: magic vars 10202 1727204060.36146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204060.36179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204060.36196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204060.36211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204060.36223: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204060.36249: variable 'inventory_hostname' from source: host vars for 'managed-node3' 
10202 1727204060.36253: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.36255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.36386: Set connection var ansible_shell_type to sh 10202 1727204060.36391: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204060.36394: Set connection var ansible_connection to ssh 10202 1727204060.36396: Set connection var ansible_shell_executable to /bin/sh 10202 1727204060.36399: Set connection var ansible_pipelining to False 10202 1727204060.36401: Set connection var ansible_timeout to 10 10202 1727204060.36404: variable 'ansible_shell_executable' from source: unknown 10202 1727204060.36406: variable 'ansible_connection' from source: unknown 10202 1727204060.36408: variable 'ansible_module_compression' from source: unknown 10202 1727204060.36411: variable 'ansible_shell_type' from source: unknown 10202 1727204060.36414: variable 'ansible_shell_executable' from source: unknown 10202 1727204060.36416: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.36418: variable 'ansible_pipelining' from source: unknown 10202 1727204060.36420: variable 'ansible_timeout' from source: unknown 10202 1727204060.36423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.36505: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204060.36514: variable 'omit' from source: magic vars 10202 1727204060.36520: starting attempt loop 10202 1727204060.36523: running the handler 10202 1727204060.36538: handler run complete 10202 1727204060.36545: attempt loop complete, returning result 10202 1727204060.36548: _execute() done 
10202 1727204060.36550: dumping result to json
10202 1727204060.36556: done dumping result, returning
10202 1727204060.36563: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-0b04-2570-0000000003b4]
10202 1727204060.36571: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b4
10202 1727204060.36658: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b4
10202 1727204060.36661: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
10202 1727204060.36720: no more pending results, returning what we have
10202 1727204060.36724: results queue empty
10202 1727204060.36725: checking for any_errors_fatal
10202 1727204060.36737: done checking for any_errors_fatal
10202 1727204060.36738: checking for max_fail_percentage
10202 1727204060.36740: done checking for max_fail_percentage
10202 1727204060.36741: checking to see if all hosts have failed and the running result is not ok
10202 1727204060.36742: done checking to see if all hosts have failed
10202 1727204060.36743: getting the remaining hosts for this loop
10202 1727204060.36745: done getting the remaining hosts for this loop
10202 1727204060.36749: getting the next task for host managed-node3
10202 1727204060.36758: done getting next task for host managed-node3
10202 1727204060.36760: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
10202 1727204060.36772: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204060.36777: getting variables 10202 1727204060.36779: in VariableManager get_vars() 10202 1727204060.36820: Calling all_inventory to load vars for managed-node3 10202 1727204060.36823: Calling groups_inventory to load vars for managed-node3 10202 1727204060.36825: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204060.36838: Calling all_plugins_play to load vars for managed-node3 10202 1727204060.36841: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204060.36844: Calling groups_plugins_play to load vars for managed-node3 10202 1727204060.37852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204060.39144: done with get_vars() 10202 1727204060.39168: done getting variables 10202 1727204060.39220: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204060.39319: variable 'profile' from source: include params 10202 1727204060.39322: variable 'item' from source: include params 10202 1727204060.39369: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.045) 0:00:22.069 ***** 10202 1727204060.39403: entering _queue_task() for managed-node3/command 10202 1727204060.39697: worker is 1 (out of 1 available) 10202 1727204060.39712: exiting _queue_task() for managed-node3/command 10202 1727204060.39725: done queuing things up, now waiting for results queue to drain 10202 1727204060.39727: waiting for pending results... 10202 1727204060.39925: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0 10202 1727204060.40013: in run() - task 127b8e07-fff9-0b04-2570-0000000003b6 10202 1727204060.40029: variable 'ansible_search_path' from source: unknown 10202 1727204060.40034: variable 'ansible_search_path' from source: unknown 10202 1727204060.40064: calling self._execute() 10202 1727204060.40145: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.40151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.40160: variable 'omit' from source: magic vars 10202 1727204060.40477: variable 'ansible_distribution_major_version' from source: facts 10202 1727204060.40487: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204060.40589: variable 'profile_stat' from source: set_fact 10202 1727204060.40602: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204060.40605: when evaluation is False, skipping this task 10202 1727204060.40608: _execute() done 10202 1727204060.40611: dumping result to json 10202 1727204060.40615: done dumping result, returning 10202 1727204060.40627: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [127b8e07-fff9-0b04-2570-0000000003b6] 10202 1727204060.40632: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b6 10202 
1727204060.40729: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b6
10202 1727204060.40733: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
10202 1727204060.40787: no more pending results, returning what we have
10202 1727204060.40790: results queue empty
10202 1727204060.40791: checking for any_errors_fatal
10202 1727204060.40800: done checking for any_errors_fatal
10202 1727204060.40801: checking for max_fail_percentage
10202 1727204060.40803: done checking for max_fail_percentage
10202 1727204060.40804: checking to see if all hosts have failed and the running result is not ok
10202 1727204060.40805: done checking to see if all hosts have failed
10202 1727204060.40805: getting the remaining hosts for this loop
10202 1727204060.40807: done getting the remaining hosts for this loop
10202 1727204060.40813: getting the next task for host managed-node3
10202 1727204060.40820: done getting next task for host managed-node3
10202 1727204060.40823: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
10202 1727204060.40829: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False 10202 1727204060.40834: getting variables 10202 1727204060.40836: in VariableManager get_vars() 10202 1727204060.40890: Calling all_inventory to load vars for managed-node3 10202 1727204060.40893: Calling groups_inventory to load vars for managed-node3 10202 1727204060.40895: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204060.40907: Calling all_plugins_play to load vars for managed-node3 10202 1727204060.40910: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204060.40913: Calling groups_plugins_play to load vars for managed-node3 10202 1727204060.41931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204060.43109: done with get_vars() 10202 1727204060.43142: done getting variables 10202 1727204060.43199: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204060.43294: variable 'profile' from source: include params 10202 1727204060.43297: variable 'item' from source: include params 10202 1727204060.43343: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.039) 0:00:22.109 ***** 10202 1727204060.43371: entering _queue_task() for managed-node3/set_fact 10202 1727204060.43670: worker is 1 (out of 1 available) 10202 1727204060.43683: exiting _queue_task() for managed-node3/set_fact 10202 1727204060.43696: done queuing things up, now waiting for results queue 
to drain 10202 1727204060.43697: waiting for pending results... 10202 1727204060.43887: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 10202 1727204060.43972: in run() - task 127b8e07-fff9-0b04-2570-0000000003b7 10202 1727204060.43986: variable 'ansible_search_path' from source: unknown 10202 1727204060.43989: variable 'ansible_search_path' from source: unknown 10202 1727204060.44022: calling self._execute() 10202 1727204060.44109: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.44115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.44127: variable 'omit' from source: magic vars 10202 1727204060.44437: variable 'ansible_distribution_major_version' from source: facts 10202 1727204060.44448: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204060.44549: variable 'profile_stat' from source: set_fact 10202 1727204060.44562: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204060.44566: when evaluation is False, skipping this task 10202 1727204060.44569: _execute() done 10202 1727204060.44579: dumping result to json 10202 1727204060.44586: done dumping result, returning 10202 1727204060.44597: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [127b8e07-fff9-0b04-2570-0000000003b7] 10202 1727204060.44600: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b7 10202 1727204060.44698: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b7 10202 1727204060.44702: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204060.44751: no more pending results, returning what we have 10202 1727204060.44754: results queue empty 10202 1727204060.44756: checking for any_errors_fatal 10202 1727204060.44761: 
done checking for any_errors_fatal 10202 1727204060.44762: checking for max_fail_percentage 10202 1727204060.44764: done checking for max_fail_percentage 10202 1727204060.44767: checking to see if all hosts have failed and the running result is not ok 10202 1727204060.44768: done checking to see if all hosts have failed 10202 1727204060.44768: getting the remaining hosts for this loop 10202 1727204060.44770: done getting the remaining hosts for this loop 10202 1727204060.44775: getting the next task for host managed-node3 10202 1727204060.44782: done getting next task for host managed-node3 10202 1727204060.44785: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10202 1727204060.44789: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204060.44794: getting variables 10202 1727204060.44796: in VariableManager get_vars() 10202 1727204060.44842: Calling all_inventory to load vars for managed-node3 10202 1727204060.44845: Calling groups_inventory to load vars for managed-node3 10202 1727204060.44847: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204060.44860: Calling all_plugins_play to load vars for managed-node3 10202 1727204060.44863: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204060.44872: Calling groups_plugins_play to load vars for managed-node3 10202 1727204060.45995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204060.47132: done with get_vars() 10202 1727204060.47160: done getting variables 10202 1727204060.47213: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204060.47305: variable 'profile' from source: include params 10202 1727204060.47308: variable 'item' from source: include params 10202 1727204060.47350: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.040) 0:00:22.149 ***** 10202 1727204060.47378: entering _queue_task() for managed-node3/command 10202 1727204060.47663: worker is 1 (out of 1 available) 10202 1727204060.47680: exiting _queue_task() for managed-node3/command 10202 1727204060.47693: done queuing things up, now waiting for results queue to drain 10202 1727204060.47694: waiting for pending results... 
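The flag-setting task whose `ansible_facts` result appears earlier in this run can be sketched directly from what the log records: the play-wide conditional `ansible_distribution_major_version != '6'`, the task-level conditional `nm_profile_exists.rc == 0`, and the three fact names in the result dict. This is an approximation; the actual YAML in get_profile_stat.yml may structure the conditionals differently.

```yaml
# Sketch reconstructed from the log: the conditional and the three
# facts match the "ok: [managed-node3]" result recorded for this task.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0
```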
10202 1727204060.47893: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0 10202 1727204060.47982: in run() - task 127b8e07-fff9-0b04-2570-0000000003b8 10202 1727204060.47993: variable 'ansible_search_path' from source: unknown 10202 1727204060.47997: variable 'ansible_search_path' from source: unknown 10202 1727204060.48036: calling self._execute() 10202 1727204060.48114: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.48121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.48134: variable 'omit' from source: magic vars 10202 1727204060.48446: variable 'ansible_distribution_major_version' from source: facts 10202 1727204060.48457: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204060.48552: variable 'profile_stat' from source: set_fact 10202 1727204060.48564: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204060.48570: when evaluation is False, skipping this task 10202 1727204060.48572: _execute() done 10202 1727204060.48575: dumping result to json 10202 1727204060.48579: done dumping result, returning 10202 1727204060.48590: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0 [127b8e07-fff9-0b04-2570-0000000003b8] 10202 1727204060.48593: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b8 10202 1727204060.48687: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b8 10202 1727204060.48692: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204060.48748: no more pending results, returning what we have 10202 1727204060.48752: results queue empty 10202 1727204060.48753: checking for any_errors_fatal 10202 1727204060.48759: done checking for any_errors_fatal 10202 1727204060.48760: checking for 
max_fail_percentage 10202 1727204060.48762: done checking for max_fail_percentage 10202 1727204060.48763: checking to see if all hosts have failed and the running result is not ok 10202 1727204060.48764: done checking to see if all hosts have failed 10202 1727204060.48764: getting the remaining hosts for this loop 10202 1727204060.48768: done getting the remaining hosts for this loop 10202 1727204060.48772: getting the next task for host managed-node3 10202 1727204060.48779: done getting next task for host managed-node3 10202 1727204060.48781: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10202 1727204060.48786: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204060.48790: getting variables 10202 1727204060.48792: in VariableManager get_vars() 10202 1727204060.48839: Calling all_inventory to load vars for managed-node3 10202 1727204060.48842: Calling groups_inventory to load vars for managed-node3 10202 1727204060.48844: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204060.48856: Calling all_plugins_play to load vars for managed-node3 10202 1727204060.48859: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204060.48861: Calling groups_plugins_play to load vars for managed-node3 10202 1727204060.49873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204060.51138: done with get_vars() 10202 1727204060.51162: done getting variables 10202 1727204060.51216: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204060.51309: variable 'profile' from source: include params 10202 1727204060.51312: variable 'item' from source: include params 10202 1727204060.51355: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.040) 0:00:22.189 ***** 10202 1727204060.51384: entering _queue_task() for managed-node3/set_fact 10202 1727204060.51673: worker is 1 (out of 1 available) 10202 1727204060.51689: exiting _queue_task() for managed-node3/set_fact 10202 1727204060.51700: done queuing things up, now waiting for results queue to drain 10202 1727204060.51701: waiting for pending results... 
10202 1727204060.51896: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0
10202 1727204060.51987: in run() - task 127b8e07-fff9-0b04-2570-0000000003b9
10202 1727204060.51999: variable 'ansible_search_path' from source: unknown
10202 1727204060.52002: variable 'ansible_search_path' from source: unknown
10202 1727204060.52044: calling self._execute()
10202 1727204060.52120: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.52126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.52137: variable 'omit' from source: magic vars
10202 1727204060.52445: variable 'ansible_distribution_major_version' from source: facts
10202 1727204060.52456: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204060.52550: variable 'profile_stat' from source: set_fact
10202 1727204060.52564: Evaluated conditional (profile_stat.stat.exists): False
10202 1727204060.52570: when evaluation is False, skipping this task
10202 1727204060.52573: _execute() done
10202 1727204060.52576: dumping result to json
10202 1727204060.52579: done dumping result, returning
10202 1727204060.52590: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [127b8e07-fff9-0b04-2570-0000000003b9]
10202 1727204060.52593: sending task result for task 127b8e07-fff9-0b04-2570-0000000003b9
10202 1727204060.52683: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003b9
10202 1727204060.52686: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
10202 1727204060.52742: no more pending results, returning what we have
10202 1727204060.52746: results queue empty
10202 1727204060.52747: checking for any_errors_fatal
10202 1727204060.52752: done checking for any_errors_fatal
10202 1727204060.52753: checking for max_fail_percentage
10202 1727204060.52755: done checking for max_fail_percentage
10202 1727204060.52756: checking to see if all hosts have failed and the running result is not ok
10202 1727204060.52757: done checking to see if all hosts have failed
10202 1727204060.52758: getting the remaining hosts for this loop
10202 1727204060.52760: done getting the remaining hosts for this loop
10202 1727204060.52764: getting the next task for host managed-node3
10202 1727204060.52774: done getting next task for host managed-node3
10202 1727204060.52777: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
10202 1727204060.52781: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204060.52785: getting variables
10202 1727204060.52786: in VariableManager get_vars()
10202 1727204060.52834: Calling all_inventory to load vars for managed-node3
10202 1727204060.52837: Calling groups_inventory to load vars for managed-node3
10202 1727204060.52839: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204060.52850: Calling all_plugins_play to load vars for managed-node3
10202 1727204060.52853: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204060.52855: Calling groups_plugins_play to load vars for managed-node3
10202 1727204060.53844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204060.55003: done with get_vars()
10202 1727204060.55033: done getting variables
10202 1727204060.55087: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
10202 1727204060.55184: variable 'profile' from source: include params
10202 1727204060.55187: variable 'item' from source: include params
10202 1727204060.55231: variable 'item' from source: include params

TASK [Assert that the profile is present - 'bond0'] ****************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Tuesday 24 September 2024  14:54:20 -0400 (0:00:00.038)       0:00:22.227 *****
10202 1727204060.55256: entering _queue_task() for managed-node3/assert
10202 1727204060.55545: worker is 1 (out of 1 available)
10202 1727204060.55562: exiting _queue_task() for managed-node3/assert
10202 1727204060.55577: done queuing things up, now waiting for results queue to drain
10202 1727204060.55578: waiting for pending results...
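For context, the skip above is a plain `when:` evaluation, and the task now being queued is the first of the asserts in `assert_profile_present.yml`. The log does not show the task bodies themselves, so the following is a hypothetical sketch of the pattern; the module choices, the grep pattern, and the file path are illustrative assumptions, not the collection's actual code:

```yaml
# Hypothetical reconstruction of the pattern seen in the log; the real
# tasks in assert_profile_present.yml may differ.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  command: grep "^# System Role:" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  # Runs only when the earlier stat found an ifcfg file; in this run
  # profile_stat.stat.exists evaluated False, hence the skip above.
  when: profile_stat.stat.exists

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists  # flag set earlier via set_fact in get_profile_stat.yml
```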
10202 1727204060.55776: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0'
10202 1727204060.55862: in run() - task 127b8e07-fff9-0b04-2570-000000000260
10202 1727204060.55877: variable 'ansible_search_path' from source: unknown
10202 1727204060.55880: variable 'ansible_search_path' from source: unknown
10202 1727204060.55917: calling self._execute()
10202 1727204060.55992: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.55998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.56007: variable 'omit' from source: magic vars
10202 1727204060.56314: variable 'ansible_distribution_major_version' from source: facts
10202 1727204060.56324: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204060.56334: variable 'omit' from source: magic vars
10202 1727204060.56371: variable 'omit' from source: magic vars
10202 1727204060.56450: variable 'profile' from source: include params
10202 1727204060.56455: variable 'item' from source: include params
10202 1727204060.56505: variable 'item' from source: include params
10202 1727204060.56521: variable 'omit' from source: magic vars
10202 1727204060.56559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
10202 1727204060.56592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
10202 1727204060.56610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
10202 1727204060.56625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204060.56638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204060.56664: variable 'inventory_hostname' from source: host vars for 'managed-node3'
10202 1727204060.56668: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.56671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.56749: Set connection var ansible_shell_type to sh
10202 1727204060.56754: Set connection var ansible_module_compression to ZIP_DEFLATED
10202 1727204060.56760: Set connection var ansible_connection to ssh
10202 1727204060.56767: Set connection var ansible_shell_executable to /bin/sh
10202 1727204060.56773: Set connection var ansible_pipelining to False
10202 1727204060.56778: Set connection var ansible_timeout to 10
10202 1727204060.56802: variable 'ansible_shell_executable' from source: unknown
10202 1727204060.56805: variable 'ansible_connection' from source: unknown
10202 1727204060.56808: variable 'ansible_module_compression' from source: unknown
10202 1727204060.56811: variable 'ansible_shell_type' from source: unknown
10202 1727204060.56813: variable 'ansible_shell_executable' from source: unknown
10202 1727204060.56816: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.56821: variable 'ansible_pipelining' from source: unknown
10202 1727204060.56824: variable 'ansible_timeout' from source: unknown
10202 1727204060.56828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.56949: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
10202 1727204060.56959: variable 'omit' from source: magic vars
10202 1727204060.56967: starting attempt loop
10202 1727204060.56971: running the handler
10202 1727204060.57063: variable 'lsr_net_profile_exists' from source: set_fact
10202 1727204060.57069: Evaluated conditional (lsr_net_profile_exists): True
10202 1727204060.57076: handler run complete
10202 1727204060.57089: attempt loop complete, returning result
10202 1727204060.57092: _execute() done
10202 1727204060.57094: dumping result to json
10202 1727204060.57099: done dumping result, returning
10202 1727204060.57106: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0' [127b8e07-fff9-0b04-2570-000000000260]
10202 1727204060.57113: sending task result for task 127b8e07-fff9-0b04-2570-000000000260
10202 1727204060.57208: done sending task result for task 127b8e07-fff9-0b04-2570-000000000260
10202 1727204060.57212: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false
}

MSG:

All assertions passed
10202 1727204060.57282: no more pending results, returning what we have
10202 1727204060.57286: results queue empty
10202 1727204060.57287: checking for any_errors_fatal
10202 1727204060.57292: done checking for any_errors_fatal
10202 1727204060.57292: checking for max_fail_percentage
10202 1727204060.57294: done checking for max_fail_percentage
10202 1727204060.57296: checking to see if all hosts have failed and the running result is not ok
10202 1727204060.57297: done checking to see if all hosts have failed
10202 1727204060.57297: getting the remaining hosts for this loop
10202 1727204060.57299: done getting the remaining hosts for this loop
10202 1727204060.57304: getting the next task for host managed-node3
10202 1727204060.57310: done getting next task for host managed-node3
10202 1727204060.57313: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
10202 1727204060.57316: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204060.57321: getting variables
10202 1727204060.57322: in VariableManager get_vars()
10202 1727204060.57367: Calling all_inventory to load vars for managed-node3
10202 1727204060.57370: Calling groups_inventory to load vars for managed-node3
10202 1727204060.57372: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204060.57411: Calling all_plugins_play to load vars for managed-node3
10202 1727204060.57414: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204060.57417: Calling groups_plugins_play to load vars for managed-node3
10202 1727204060.64309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204060.65474: done with get_vars()
10202 1727204060.65506: done getting variables
10202 1727204060.65550: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
10202 1727204060.65670: variable 'profile' from source: include params
10202 1727204060.65673: variable 'item' from source: include params
10202 1727204060.65736: variable 'item' from source: include params

TASK [Assert that the ansible managed comment is present in 'bond0'] ***********
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Tuesday 24 September 2024  14:54:20 -0400 (0:00:00.105)       0:00:22.333 *****
10202 1727204060.65773: entering _queue_task() for managed-node3/assert
10202 1727204060.66154: worker is 1 (out of 1 available)
10202 1727204060.66172: exiting _queue_task() for managed-node3/assert
10202 1727204060.66185: done queuing things up, now waiting for results queue to drain
10202 1727204060.66187: waiting for pending results...
10202 1727204060.66453: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0'
10202 1727204060.66659: in run() - task 127b8e07-fff9-0b04-2570-000000000261
10202 1727204060.66664: variable 'ansible_search_path' from source: unknown
10202 1727204060.66670: variable 'ansible_search_path' from source: unknown
10202 1727204060.66769: calling self._execute()
10202 1727204060.66843: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.66858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.66882: variable 'omit' from source: magic vars
10202 1727204060.67319: variable 'ansible_distribution_major_version' from source: facts
10202 1727204060.67340: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204060.67353: variable 'omit' from source: magic vars
10202 1727204060.67422: variable 'omit' from source: magic vars
10202 1727204060.67541: variable 'profile' from source: include params
10202 1727204060.67770: variable 'item' from source: include params
10202 1727204060.67774: variable 'item' from source: include params
10202 1727204060.67777: variable 'omit' from source: magic vars
10202 1727204060.67780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
10202 1727204060.67783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
10202 1727204060.67785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
10202 1727204060.67787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204060.67807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204060.67846: variable 'inventory_hostname' from source: host vars for 'managed-node3'
10202 1727204060.67855: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.67863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.67985: Set connection var ansible_shell_type to sh
10202 1727204060.68000: Set connection var ansible_module_compression to ZIP_DEFLATED
10202 1727204060.68019: Set connection var ansible_connection to ssh
10202 1727204060.68031: Set connection var ansible_shell_executable to /bin/sh
10202 1727204060.68042: Set connection var ansible_pipelining to False
10202 1727204060.68052: Set connection var ansible_timeout to 10
10202 1727204060.68086: variable 'ansible_shell_executable' from source: unknown
10202 1727204060.68095: variable 'ansible_connection' from source: unknown
10202 1727204060.68103: variable 'ansible_module_compression' from source: unknown
10202 1727204060.68110: variable 'ansible_shell_type' from source: unknown
10202 1727204060.68125: variable 'ansible_shell_executable' from source: unknown
10202 1727204060.68133: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.68143: variable 'ansible_pipelining' from source: unknown
10202 1727204060.68178: variable 'ansible_timeout' from source: unknown
10202 1727204060.68182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.68313: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
10202 1727204060.68323: variable 'omit' from source: magic vars
10202 1727204060.68330: starting attempt loop
10202 1727204060.68334: running the handler
10202 1727204060.68431: variable 'lsr_net_profile_ansible_managed' from source: set_fact
10202 1727204060.68435: Evaluated conditional (lsr_net_profile_ansible_managed): True
10202 1727204060.68441: handler run complete
10202 1727204060.68455: attempt loop complete, returning result
10202 1727204060.68458: _execute() done
10202 1727204060.68460: dumping result to json
10202 1727204060.68464: done dumping result, returning
10202 1727204060.68473: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0' [127b8e07-fff9-0b04-2570-000000000261]
10202 1727204060.68483: sending task result for task 127b8e07-fff9-0b04-2570-000000000261
10202 1727204060.68572: done sending task result for task 127b8e07-fff9-0b04-2570-000000000261
10202 1727204060.68575: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false
}

MSG:

All assertions passed
10202 1727204060.68639: no more pending results, returning what we have
10202 1727204060.68642: results queue empty
10202 1727204060.68643: checking for any_errors_fatal
10202 1727204060.68650: done checking for any_errors_fatal
10202 1727204060.68651: checking for max_fail_percentage
10202 1727204060.68652: done checking for max_fail_percentage
10202 1727204060.68653: checking to see if all hosts have failed and the running result is not ok
10202 1727204060.68654: done checking to see if all hosts have failed
10202 1727204060.68655: getting the remaining hosts for this loop
10202 1727204060.68657: done getting the remaining hosts for this loop
10202 1727204060.68662: getting the next task for host managed-node3
10202 1727204060.68669: done getting next task for host managed-node3
10202 1727204060.68672: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
10202 1727204060.68675: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204060.68679: getting variables
10202 1727204060.68681: in VariableManager get_vars()
10202 1727204060.68725: Calling all_inventory to load vars for managed-node3
10202 1727204060.68730: Calling groups_inventory to load vars for managed-node3
10202 1727204060.68733: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204060.68743: Calling all_plugins_play to load vars for managed-node3
10202 1727204060.68746: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204060.68748: Calling groups_plugins_play to load vars for managed-node3
10202 1727204060.69741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204060.71681: done with get_vars()
10202 1727204060.71716: done getting variables
10202 1727204060.71787: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
10202 1727204060.71931: variable 'profile' from source: include params
10202 1727204060.71935: variable 'item' from source: include params
10202 1727204060.72020: variable 'item' from source: include params

TASK [Assert that the fingerprint comment is present in bond0] *****************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Tuesday 24 September 2024  14:54:20 -0400 (0:00:00.062)       0:00:22.395 *****
10202 1727204060.72062: entering _queue_task() for managed-node3/assert
10202 1727204060.72449: worker is 1 (out of 1 available)
10202 1727204060.72463: exiting _queue_task() for managed-node3/assert
10202 1727204060.72479: done queuing things up, now waiting for results queue to drain
10202 1727204060.72480: waiting for pending results...
10202 1727204060.72888: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0
10202 1727204060.72894: in run() - task 127b8e07-fff9-0b04-2570-000000000262
10202 1727204060.72912: variable 'ansible_search_path' from source: unknown
10202 1727204060.72916: variable 'ansible_search_path' from source: unknown
10202 1727204060.72955: calling self._execute()
10202 1727204060.73206: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.73210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.73213: variable 'omit' from source: magic vars
10202 1727204060.73505: variable 'ansible_distribution_major_version' from source: facts
10202 1727204060.73517: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204060.73534: variable 'omit' from source: magic vars
10202 1727204060.73579: variable 'omit' from source: magic vars
10202 1727204060.73700: variable 'profile' from source: include params
10202 1727204060.73710: variable 'item' from source: include params
10202 1727204060.73779: variable 'item' from source: include params
10202 1727204060.73799: variable 'omit' from source: magic vars
10202 1727204060.73849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
10202 1727204060.73891: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
10202 1727204060.73915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
10202 1727204060.73939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204060.73951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204060.73989: variable 'inventory_hostname' from source: host vars for 'managed-node3'
10202 1727204060.73992: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.73995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.74170: Set connection var ansible_shell_type to sh
10202 1727204060.74174: Set connection var ansible_module_compression to ZIP_DEFLATED
10202 1727204060.74176: Set connection var ansible_connection to ssh
10202 1727204060.74179: Set connection var ansible_shell_executable to /bin/sh
10202 1727204060.74181: Set connection var ansible_pipelining to False
10202 1727204060.74184: Set connection var ansible_timeout to 10
10202 1727204060.74186: variable 'ansible_shell_executable' from source: unknown
10202 1727204060.74188: variable 'ansible_connection' from source: unknown
10202 1727204060.74190: variable 'ansible_module_compression' from source: unknown
10202 1727204060.74192: variable 'ansible_shell_type' from source: unknown
10202 1727204060.74195: variable 'ansible_shell_executable' from source: unknown
10202 1727204060.74197: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.74199: variable 'ansible_pipelining' from source: unknown
10202 1727204060.74202: variable 'ansible_timeout' from source: unknown
10202 1727204060.74204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.74336: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
10202 1727204060.74348: variable 'omit' from source: magic vars
10202 1727204060.74353: starting attempt loop
10202 1727204060.74356: running the handler
10202 1727204060.74671: variable 'lsr_net_profile_fingerprint' from source: set_fact
10202 1727204060.74674: Evaluated conditional (lsr_net_profile_fingerprint): True
10202 1727204060.74676: handler run complete
10202 1727204060.74678: attempt loop complete, returning result
10202 1727204060.74680: _execute() done
10202 1727204060.74682: dumping result to json
10202 1727204060.74684: done dumping result, returning
10202 1727204060.74686: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0 [127b8e07-fff9-0b04-2570-000000000262]
10202 1727204060.74688: sending task result for task 127b8e07-fff9-0b04-2570-000000000262
10202 1727204060.74755: done sending task result for task 127b8e07-fff9-0b04-2570-000000000262
10202 1727204060.74758: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false
}

MSG:

All assertions passed
10202 1727204060.74805: no more pending results, returning what we have
10202 1727204060.74808: results queue empty
10202 1727204060.74809: checking for any_errors_fatal
10202 1727204060.74814: done checking for any_errors_fatal
10202 1727204060.74815: checking for max_fail_percentage
10202 1727204060.74817: done checking for max_fail_percentage
10202 1727204060.74818: checking to see if all hosts have failed and the running result is not ok
10202 1727204060.74819: done checking to see if all hosts have failed
10202 1727204060.74820: getting the remaining hosts for this loop
10202 1727204060.74821: done getting the remaining hosts for this loop
10202 1727204060.74824: getting the next task for host managed-node3
10202 1727204060.74832: done getting next task for host managed-node3
10202 1727204060.74835: ^ task is: TASK: Include the task 'get_profile_stat.yml'
10202 1727204060.74838: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204060.74842: getting variables
10202 1727204060.74843: in VariableManager get_vars()
10202 1727204060.74884: Calling all_inventory to load vars for managed-node3
10202 1727204060.74887: Calling groups_inventory to load vars for managed-node3
10202 1727204060.74889: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204060.74899: Calling all_plugins_play to load vars for managed-node3
10202 1727204060.74902: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204060.74905: Calling groups_plugins_play to load vars for managed-node3
10202 1727204060.76735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204060.78964: done with get_vars()
10202 1727204060.79009: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024  14:54:20 -0400 (0:00:00.070)       0:00:22.466 *****
10202 1727204060.79130: entering _queue_task() for managed-node3/include_tasks
10202 1727204060.79549: worker is 1 (out of 1 available)
10202 1727204060.79571: exiting _queue_task() for managed-node3/include_tasks
10202 1727204060.79585: done queuing things up, now waiting for results queue to drain
10202 1727204060.79587: waiting for pending results...
10202 1727204060.79969: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml'
10202 1727204060.79982: in run() - task 127b8e07-fff9-0b04-2570-000000000266
10202 1727204060.79997: variable 'ansible_search_path' from source: unknown
10202 1727204060.80000: variable 'ansible_search_path' from source: unknown
10202 1727204060.80046: calling self._execute()
10202 1727204060.80160: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204060.80171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204060.80184: variable 'omit' from source: magic vars
10202 1727204060.80708: variable 'ansible_distribution_major_version' from source: facts
10202 1727204060.80712: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204060.80715: _execute() done
10202 1727204060.80718: dumping result to json
10202 1727204060.80720: done dumping result, returning
10202 1727204060.80723: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-0b04-2570-000000000266]
10202 1727204060.80725: sending task result for task 127b8e07-fff9-0b04-2570-000000000266
10202 1727204060.80843: no more pending results, returning what we have
10202 1727204060.80848: in VariableManager get_vars()
10202 1727204060.80901: Calling all_inventory to load vars for managed-node3
10202 1727204060.80905: Calling groups_inventory to load vars for managed-node3
10202 1727204060.80907: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204060.80928: Calling all_plugins_play to load vars for managed-node3
10202 1727204060.80931: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204060.80936: Calling groups_plugins_play to load vars for managed-node3
10202 1727204060.81499: done sending task result for task 127b8e07-fff9-0b04-2570-000000000266
10202 1727204060.81504: WORKER PROCESS EXITING
10202 1727204060.82820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204060.84015: done with get_vars()
10202 1727204060.84042: variable 'ansible_search_path' from source: unknown
10202 1727204060.84043: variable 'ansible_search_path' from source: unknown
10202 1727204060.84080: we have included files to process
10202 1727204060.84081: generating all_blocks data
10202 1727204060.84083: done generating all_blocks data
10202 1727204060.84087: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
10202 1727204060.84088: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
10202 1727204060.84090: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
10202 1727204060.85110: done processing included file
10202 1727204060.85113: iterating over new_blocks loaded from include file
10202 1727204060.85115: in VariableManager get_vars()
10202 1727204060.85147: done with get_vars()
10202 1727204060.85150: filtering new block on tags
10202 1727204060.85181: done filtering new block on tags
10202 1727204060.85184: in VariableManager get_vars()
10202 1727204060.85206: done with get_vars()
10202 1727204060.85208: filtering new block on tags
10202 1727204060.85242: done filtering new block on tags
10202 1727204060.85245: done iterating over new_blocks loaded from include file
included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3
10202 1727204060.85252: extending task lists for all hosts with included blocks
10202 1727204060.85437: done extending task lists
10202 1727204060.85438: done processing included files
10202 1727204060.85439: results queue empty
10202 1727204060.85440: checking for any_errors_fatal
10202 1727204060.85442: done checking for any_errors_fatal
10202 1727204060.85443: checking for max_fail_percentage
10202 1727204060.85443: done checking for max_fail_percentage
10202 1727204060.85444: checking to see if all hosts have failed and the running result is not ok
10202 1727204060.85445: done checking to see if all hosts have failed
10202 1727204060.85445: getting the remaining hosts for this loop
10202 1727204060.85446: done getting the remaining hosts for this loop
10202 1727204060.85448: getting the next task for host managed-node3
10202 1727204060.85453: done getting next task for host managed-node3
10202 1727204060.85456: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
10202 1727204060.85458: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204060.85460: getting variables
10202 1727204060.85461: in VariableManager get_vars()
10202 1727204060.85475: Calling all_inventory to load vars for managed-node3
10202 1727204060.85477: Calling groups_inventory to load vars for managed-node3
10202 1727204060.85479: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204060.85484: Calling all_plugins_play to load vars for managed-node3
10202 1727204060.85485: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204060.85487: Calling groups_plugins_play to load vars for managed-node3
10202 1727204060.86384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204060.87759: done with get_vars()
10202 1727204060.87789: done getting variables
10202 1727204060.87843: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024  14:54:20 -0400 (0:00:00.087)       0:00:22.554 *****
10202 1727204060.87876: entering _queue_task() for managed-node3/set_fact
10202 1727204060.88251: worker is 1 (out of 1 available)
10202 1727204060.88269: exiting _queue_task() for managed-node3/set_fact
10202 1727204060.88285: done queuing things up, now waiting for results queue to drain
10202 1727204060.88286: waiting for pending results...
10202 1727204060.88603: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 10202 1727204060.88779: in run() - task 127b8e07-fff9-0b04-2570-0000000003f8 10202 1727204060.88783: variable 'ansible_search_path' from source: unknown 10202 1727204060.88872: variable 'ansible_search_path' from source: unknown 10202 1727204060.88878: calling self._execute() 10202 1727204060.88955: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.89002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.89062: variable 'omit' from source: magic vars 10202 1727204060.89436: variable 'ansible_distribution_major_version' from source: facts 10202 1727204060.89443: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204060.89446: variable 'omit' from source: magic vars 10202 1727204060.89486: variable 'omit' from source: magic vars 10202 1727204060.89517: variable 'omit' from source: magic vars 10202 1727204060.89561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204060.89591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204060.89608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204060.89624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204060.89635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204060.89665: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204060.89668: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.89671: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 10202 1727204060.89745: Set connection var ansible_shell_type to sh 10202 1727204060.89750: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204060.89761: Set connection var ansible_connection to ssh 10202 1727204060.89764: Set connection var ansible_shell_executable to /bin/sh 10202 1727204060.89769: Set connection var ansible_pipelining to False 10202 1727204060.89775: Set connection var ansible_timeout to 10 10202 1727204060.89795: variable 'ansible_shell_executable' from source: unknown 10202 1727204060.89798: variable 'ansible_connection' from source: unknown 10202 1727204060.89802: variable 'ansible_module_compression' from source: unknown 10202 1727204060.89804: variable 'ansible_shell_type' from source: unknown 10202 1727204060.89806: variable 'ansible_shell_executable' from source: unknown 10202 1727204060.89809: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.89813: variable 'ansible_pipelining' from source: unknown 10202 1727204060.89816: variable 'ansible_timeout' from source: unknown 10202 1727204060.89820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.89936: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204060.89946: variable 'omit' from source: magic vars 10202 1727204060.89951: starting attempt loop 10202 1727204060.89954: running the handler 10202 1727204060.89967: handler run complete 10202 1727204060.89975: attempt loop complete, returning result 10202 1727204060.89985: _execute() done 10202 1727204060.89988: dumping result to json 10202 1727204060.89990: done dumping result, returning 10202 1727204060.89993: done running TaskExecutor() for 
managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-0b04-2570-0000000003f8] 10202 1727204060.90001: sending task result for task 127b8e07-fff9-0b04-2570-0000000003f8 10202 1727204060.90093: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003f8 10202 1727204060.90096: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10202 1727204060.90163: no more pending results, returning what we have 10202 1727204060.90168: results queue empty 10202 1727204060.90170: checking for any_errors_fatal 10202 1727204060.90172: done checking for any_errors_fatal 10202 1727204060.90173: checking for max_fail_percentage 10202 1727204060.90175: done checking for max_fail_percentage 10202 1727204060.90176: checking to see if all hosts have failed and the running result is not ok 10202 1727204060.90177: done checking to see if all hosts have failed 10202 1727204060.90177: getting the remaining hosts for this loop 10202 1727204060.90179: done getting the remaining hosts for this loop 10202 1727204060.90183: getting the next task for host managed-node3 10202 1727204060.90190: done getting next task for host managed-node3 10202 1727204060.90192: ^ task is: TASK: Stat profile file 10202 1727204060.90196: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204060.90200: getting variables 10202 1727204060.90202: in VariableManager get_vars() 10202 1727204060.90247: Calling all_inventory to load vars for managed-node3 10202 1727204060.90250: Calling groups_inventory to load vars for managed-node3 10202 1727204060.90252: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204060.90263: Calling all_plugins_play to load vars for managed-node3 10202 1727204060.90273: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204060.90277: Calling groups_plugins_play to load vars for managed-node3 10202 1727204060.91644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204060.93591: done with get_vars() 10202 1727204060.93631: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.058) 0:00:22.612 ***** 10202 1727204060.93741: entering _queue_task() for managed-node3/stat 10202 1727204060.94072: worker is 1 (out of 1 available) 10202 1727204060.94087: exiting _queue_task() for managed-node3/stat 10202 1727204060.94101: done queuing things up, now waiting for results queue to drain 10202 1727204060.94103: waiting for pending results... 
10202 1727204060.94299: running TaskExecutor() for managed-node3/TASK: Stat profile file 10202 1727204060.94383: in run() - task 127b8e07-fff9-0b04-2570-0000000003f9 10202 1727204060.94397: variable 'ansible_search_path' from source: unknown 10202 1727204060.94401: variable 'ansible_search_path' from source: unknown 10202 1727204060.94440: calling self._execute() 10202 1727204060.94522: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.94525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.94541: variable 'omit' from source: magic vars 10202 1727204060.94858: variable 'ansible_distribution_major_version' from source: facts 10202 1727204060.94873: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204060.94878: variable 'omit' from source: magic vars 10202 1727204060.94911: variable 'omit' from source: magic vars 10202 1727204060.94993: variable 'profile' from source: include params 10202 1727204060.94997: variable 'item' from source: include params 10202 1727204060.95049: variable 'item' from source: include params 10202 1727204060.95064: variable 'omit' from source: magic vars 10202 1727204060.95107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204060.95141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204060.95158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204060.95174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204060.95185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204060.95214: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 
1727204060.95218: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.95221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.95303: Set connection var ansible_shell_type to sh 10202 1727204060.95308: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204060.95311: Set connection var ansible_connection to ssh 10202 1727204060.95320: Set connection var ansible_shell_executable to /bin/sh 10202 1727204060.95322: Set connection var ansible_pipelining to False 10202 1727204060.95328: Set connection var ansible_timeout to 10 10202 1727204060.95351: variable 'ansible_shell_executable' from source: unknown 10202 1727204060.95354: variable 'ansible_connection' from source: unknown 10202 1727204060.95357: variable 'ansible_module_compression' from source: unknown 10202 1727204060.95360: variable 'ansible_shell_type' from source: unknown 10202 1727204060.95362: variable 'ansible_shell_executable' from source: unknown 10202 1727204060.95364: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204060.95370: variable 'ansible_pipelining' from source: unknown 10202 1727204060.95373: variable 'ansible_timeout' from source: unknown 10202 1727204060.95378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204060.95547: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204060.95556: variable 'omit' from source: magic vars 10202 1727204060.95561: starting attempt loop 10202 1727204060.95564: running the handler 10202 1727204060.95579: _low_level_execute_command(): starting 10202 1727204060.95586: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204060.96361: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204060.96383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204060.96398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204060.96417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204060.96491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204060.96547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204060.96564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204060.96612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204060.96749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204060.98602: stdout chunk (state=3): >>>/root <<< 10202 1727204060.98706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204060.98777: stderr chunk (state=3): >>><<< 10202 1727204060.98781: stdout chunk (state=3): >>><<< 10202 1727204060.98806: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204060.98820: _low_level_execute_command(): starting 10202 1727204060.98830: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486 `" && echo ansible-tmp-1727204060.988055-11566-59392354454486="` echo /root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486 `" ) && sleep 0' 10202 1727204060.99349: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204060.99353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204060.99364: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204060.99369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204060.99372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204060.99420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204060.99424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204060.99426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204060.99507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.01671: stdout chunk (state=3): >>>ansible-tmp-1727204060.988055-11566-59392354454486=/root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486 <<< 10202 1727204061.01776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.01836: stderr chunk (state=3): >>><<< 10202 1727204061.01840: stdout chunk (state=3): >>><<< 10202 1727204061.01859: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204060.988055-11566-59392354454486=/root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204061.01906: variable 'ansible_module_compression' from source: unknown 10202 1727204061.01956: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10202 1727204061.01995: variable 'ansible_facts' from source: unknown 10202 1727204061.02047: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/AnsiballZ_stat.py 10202 1727204061.02162: Sending initial data 10202 1727204061.02167: Sent initial data (151 bytes) 10202 1727204061.02682: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204061.02686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
10202 1727204061.02689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204061.02691: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204061.02693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.02743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.02760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.02830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.04637: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204061.04702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204061.04770: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpuucsqzyv /root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/AnsiballZ_stat.py <<< 10202 1727204061.04773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/AnsiballZ_stat.py" <<< 10202 1727204061.04834: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpuucsqzyv" to remote "/root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/AnsiballZ_stat.py" <<< 10202 1727204061.04841: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/AnsiballZ_stat.py" <<< 10202 1727204061.05515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.05595: stderr chunk (state=3): >>><<< 10202 1727204061.05599: stdout chunk (state=3): >>><<< 10202 1727204061.05621: done transferring module to remote 10202 1727204061.05634: _low_level_execute_command(): starting 10202 1727204061.05639: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/ /root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/AnsiballZ_stat.py && sleep 0' 10202 1727204061.06144: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204061.06148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.06151: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204061.06158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.06213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204061.06216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.06219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.06291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.08316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.08380: stderr chunk (state=3): >>><<< 10202 1727204061.08384: stdout chunk (state=3): >>><<< 10202 1727204061.08399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204061.08403: _low_level_execute_command(): starting 10202 1727204061.08408: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/AnsiballZ_stat.py && sleep 0' 10202 1727204061.08927: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204061.08933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.08936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204061.08938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.08995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' <<< 10202 1727204061.08999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.09003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.09083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.26875: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10202 1727204061.28400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204061.28468: stderr chunk (state=3): >>><<< 10202 1727204061.28474: stdout chunk (state=3): >>><<< 10202 1727204061.28486: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204061.28514: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204061.28527: _low_level_execute_command(): starting 10202 1727204061.28534: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204060.988055-11566-59392354454486/ > /dev/null 2>&1 && sleep 0' 10202 1727204061.29035: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204061.29039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.29042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204061.29050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.29098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204061.29102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.29111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.29201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.31251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.31311: stderr chunk (state=3): >>><<< 10202 1727204061.31316: stdout chunk (state=3): >>><<< 10202 1727204061.31330: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204061.31339: handler run complete 10202 1727204061.31356: attempt loop complete, returning result 10202 1727204061.31359: _execute() done 10202 1727204061.31362: dumping result to json 10202 1727204061.31368: done dumping result, returning 10202 1727204061.31377: done running TaskExecutor() for managed-node3/TASK: Stat profile file [127b8e07-fff9-0b04-2570-0000000003f9] 10202 1727204061.31386: sending task result for task 127b8e07-fff9-0b04-2570-0000000003f9 10202 1727204061.31491: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003f9 10202 1727204061.31494: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 10202 1727204061.31555: no more pending results, returning what we have 10202 1727204061.31558: results queue empty 10202 1727204061.31559: checking for any_errors_fatal 10202 1727204061.31570: done checking for any_errors_fatal 10202 1727204061.31571: checking for max_fail_percentage 10202 1727204061.31572: done checking for max_fail_percentage 10202 1727204061.31573: checking to see if all hosts have failed and the running result is not ok 10202 1727204061.31574: done checking to see if all hosts have failed 10202 1727204061.31575: getting the remaining hosts for this loop 10202 1727204061.31577: done getting the remaining hosts for this loop 10202 1727204061.31581: getting the next task for host managed-node3 
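The `ok` result above (`stat.exists: false` for `/etc/sysconfig/network-scripts/ifcfg-bond0.0`) corresponds to a `stat` task roughly like the following. This is a hedged reconstruction from the logged `module_args` and the later `profile_stat.stat.exists` conditional, not the verbatim contents of `get_profile_stat.yml`:

```yaml
# Sketch reconstructed from the logged module_args; the "{{ profile }}"
# variable is inferred from the include params seen elsewhere in the log,
# and the register name is taken from the later conditional.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
```

Disabling attribute, checksum, and MIME collection keeps this to a bare existence check, which matches the minimal `{"stat": {"exists": false}}` payload in the result.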
10202 1727204061.31588: done getting next task for host managed-node3 10202 1727204061.31591: ^ task is: TASK: Set NM profile exist flag based on the profile files 10202 1727204061.31595: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204061.31599: getting variables 10202 1727204061.31601: in VariableManager get_vars() 10202 1727204061.31648: Calling all_inventory to load vars for managed-node3 10202 1727204061.31651: Calling groups_inventory to load vars for managed-node3 10202 1727204061.31653: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204061.31664: Calling all_plugins_play to load vars for managed-node3 10202 1727204061.31675: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204061.31678: Calling groups_plugins_play to load vars for managed-node3 10202 1727204061.32820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204061.33985: done with get_vars() 10202 1727204061.34013: done getting variables 10202 1727204061.34065: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.403) 0:00:23.016 ***** 10202 1727204061.34092: entering _queue_task() for managed-node3/set_fact 10202 1727204061.34373: worker is 1 (out of 1 available) 10202 1727204061.34388: exiting _queue_task() for managed-node3/set_fact 10202 1727204061.34403: done queuing things up, now waiting for results queue to drain 10202 1727204061.34404: waiting for pending results... 
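The following entries show this task being skipped because `profile_stat.stat.exists` evaluated to `False`. Based only on the logged task name and conditional, the task presumably looks something like this sketch; the fact name is an assumption:

```yaml
# Hedged sketch: only the task name and the `when:` condition come from
# the log; the flag variable name below is hypothetical.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    profile_exists: true  # hypothetical variable name
  when: profile_stat.stat.exists
```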
10202 1727204061.34607: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 10202 1727204061.34702: in run() - task 127b8e07-fff9-0b04-2570-0000000003fa 10202 1727204061.34716: variable 'ansible_search_path' from source: unknown 10202 1727204061.34720: variable 'ansible_search_path' from source: unknown 10202 1727204061.34758: calling self._execute() 10202 1727204061.34838: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204061.34844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204061.34861: variable 'omit' from source: magic vars 10202 1727204061.35171: variable 'ansible_distribution_major_version' from source: facts 10202 1727204061.35183: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204061.35278: variable 'profile_stat' from source: set_fact 10202 1727204061.35298: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204061.35302: when evaluation is False, skipping this task 10202 1727204061.35304: _execute() done 10202 1727204061.35307: dumping result to json 10202 1727204061.35309: done dumping result, returning 10202 1727204061.35318: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-0b04-2570-0000000003fa] 10202 1727204061.35321: sending task result for task 127b8e07-fff9-0b04-2570-0000000003fa 10202 1727204061.35415: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003fa 10202 1727204061.35420: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204061.35474: no more pending results, returning what we have 10202 1727204061.35478: results queue empty 10202 1727204061.35479: checking for any_errors_fatal 10202 1727204061.35491: done checking for any_errors_fatal 10202 1727204061.35492: 
checking for max_fail_percentage 10202 1727204061.35493: done checking for max_fail_percentage 10202 1727204061.35494: checking to see if all hosts have failed and the running result is not ok 10202 1727204061.35495: done checking to see if all hosts have failed 10202 1727204061.35496: getting the remaining hosts for this loop 10202 1727204061.35498: done getting the remaining hosts for this loop 10202 1727204061.35502: getting the next task for host managed-node3 10202 1727204061.35509: done getting next task for host managed-node3 10202 1727204061.35513: ^ task is: TASK: Get NM profile info 10202 1727204061.35519: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204061.35523: getting variables 10202 1727204061.35525: in VariableManager get_vars() 10202 1727204061.35571: Calling all_inventory to load vars for managed-node3 10202 1727204061.35574: Calling groups_inventory to load vars for managed-node3 10202 1727204061.35576: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204061.35588: Calling all_plugins_play to load vars for managed-node3 10202 1727204061.35591: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204061.35593: Calling groups_plugins_play to load vars for managed-node3 10202 1727204061.36593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204061.38073: done with get_vars() 10202 1727204061.38100: done getting variables 10202 1727204061.38163: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.040) 0:00:23.057 ***** 10202 1727204061.38193: entering _queue_task() for managed-node3/shell 10202 1727204061.38494: worker is 1 (out of 1 available) 10202 1727204061.38510: exiting _queue_task() for managed-node3/shell 10202 1727204061.38524: done queuing things up, now waiting for results queue to drain 10202 1727204061.38525: waiting for pending results... 
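The command this task ends up running appears later in the log as `nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc`. As a minimal sketch of that filter, with canned sample text standing in for the `nmcli` output (an assumption for illustration; the second sample row is invented to show what gets discarded):

```shell
# Sample text standing in for `nmcli -f NAME,FILENAME connection show`.
sample_output='NAME     FILENAME
bond0.0  /etc/NetworkManager/system-connections/bond0.0.nmconnection
lo       /run/NetworkManager/system-connections/lo.nmconnection'

# Keep rows that mention the profile AND point at a keyfile under /etc,
# i.e. a persistent NetworkManager profile rather than a runtime one
# under /run. Matches the pipeline the task runs against real nmcli.
printf '%s\n' "$sample_output" | grep bond0.0 | grep /etc
```

With real `nmcli` output, a non-zero `grep` exit status (no match) is what signals that no persistent profile exists for the name.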
10202 1727204061.38749: running TaskExecutor() for managed-node3/TASK: Get NM profile info 10202 1727204061.38836: in run() - task 127b8e07-fff9-0b04-2570-0000000003fb 10202 1727204061.38849: variable 'ansible_search_path' from source: unknown 10202 1727204061.38854: variable 'ansible_search_path' from source: unknown 10202 1727204061.38889: calling self._execute() 10202 1727204061.38970: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204061.38979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204061.38988: variable 'omit' from source: magic vars 10202 1727204061.39305: variable 'ansible_distribution_major_version' from source: facts 10202 1727204061.39314: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204061.39321: variable 'omit' from source: magic vars 10202 1727204061.39362: variable 'omit' from source: magic vars 10202 1727204061.39443: variable 'profile' from source: include params 10202 1727204061.39447: variable 'item' from source: include params 10202 1727204061.39500: variable 'item' from source: include params 10202 1727204061.39517: variable 'omit' from source: magic vars 10202 1727204061.39560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204061.39593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204061.39611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204061.39627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204061.39643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204061.39668: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 
1727204061.39671: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204061.39674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204061.39755: Set connection var ansible_shell_type to sh 10202 1727204061.39758: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204061.39764: Set connection var ansible_connection to ssh 10202 1727204061.39771: Set connection var ansible_shell_executable to /bin/sh 10202 1727204061.39777: Set connection var ansible_pipelining to False 10202 1727204061.39783: Set connection var ansible_timeout to 10 10202 1727204061.39804: variable 'ansible_shell_executable' from source: unknown 10202 1727204061.39807: variable 'ansible_connection' from source: unknown 10202 1727204061.39809: variable 'ansible_module_compression' from source: unknown 10202 1727204061.39812: variable 'ansible_shell_type' from source: unknown 10202 1727204061.39814: variable 'ansible_shell_executable' from source: unknown 10202 1727204061.39816: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204061.39819: variable 'ansible_pipelining' from source: unknown 10202 1727204061.39821: variable 'ansible_timeout' from source: unknown 10202 1727204061.39827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204061.39944: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204061.39956: variable 'omit' from source: magic vars 10202 1727204061.39959: starting attempt loop 10202 1727204061.39961: running the handler 10202 1727204061.39974: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204061.39991: _low_level_execute_command(): starting 10202 1727204061.39998: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204061.40573: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204061.40577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.40581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204061.40583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.40639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204061.40642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.40644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.40732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.42581: stdout chunk (state=3): >>>/root <<< 10202 1727204061.42687: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.42748: stderr chunk (state=3): >>><<< 10202 1727204061.42751: stdout chunk (state=3): >>><<< 10202 1727204061.42777: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204061.42789: _low_level_execute_command(): starting 10202 1727204061.42799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339 `" && echo ansible-tmp-1727204061.4277675-11576-266691909519339="` echo /root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339 `" ) && sleep 0' 10202 1727204061.43306: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config <<< 10202 1727204061.43310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.43313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204061.43315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204061.43317: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.43381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204061.43384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.43386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.43454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.45599: stdout chunk (state=3): >>>ansible-tmp-1727204061.4277675-11576-266691909519339=/root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339 <<< 10202 1727204061.45713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.45776: stderr chunk (state=3): >>><<< 10202 1727204061.45780: stdout chunk (state=3): >>><<< 10202 1727204061.45799: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204061.4277675-11576-266691909519339=/root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204061.45833: variable 'ansible_module_compression' from source: unknown 10202 1727204061.45879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204061.45916: variable 'ansible_facts' from source: unknown 10202 1727204061.45968: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/AnsiballZ_command.py 10202 1727204061.46083: Sending initial data 10202 1727204061.46086: Sent initial data (156 bytes) 10202 1727204061.46585: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204061.46588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.46591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204061.46595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.46646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204061.46650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.46732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.48504: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204061.48562: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10202 1727204061.48634: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp3mq16400 /root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/AnsiballZ_command.py <<< 10202 1727204061.48640: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/AnsiballZ_command.py" <<< 10202 1727204061.48697: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp3mq16400" to remote "/root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/AnsiballZ_command.py" <<< 10202 1727204061.48700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/AnsiballZ_command.py" <<< 10202 1727204061.49350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.49430: stderr chunk (state=3): >>><<< 10202 1727204061.49433: stdout chunk (state=3): >>><<< 10202 1727204061.49451: done transferring module to remote 10202 1727204061.49462: _low_level_execute_command(): starting 10202 1727204061.49468: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/ /root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/AnsiballZ_command.py && sleep 0' 10202 1727204061.49972: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204061.49975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204061.49983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 10202 1727204061.49986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204061.49988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.50041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204061.50045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.50049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.50123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.52164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.52216: stderr chunk (state=3): >>><<< 10202 1727204061.52219: stdout chunk (state=3): >>><<< 10202 1727204061.52237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204061.52240: _low_level_execute_command(): starting 10202 1727204061.52250: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/AnsiballZ_command.py && sleep 0' 10202 1727204061.52745: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204061.52749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.52752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204061.52811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204061.52814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.52817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.52902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.73104: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:21.704688", "end": "2024-09-24 14:54:21.728716", "delta": "0:00:00.024028", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204061.74897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204061.75322: stderr chunk (state=3): >>><<< 10202 1727204061.75326: stdout chunk (state=3): >>><<< 10202 1727204061.75329: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:21.704688", "end": "2024-09-24 14:54:21.728716", "delta": "0:00:00.024028", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.45.169 closed. 10202 1727204061.75333: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204061.75341: _low_level_execute_command(): starting 10202 1727204061.75343: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204061.4277675-11576-266691909519339/ > /dev/null 2>&1 && sleep 0' 10202 1727204061.76324: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204061.76693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204061.76711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204061.76727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204061.76833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204061.78870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204061.78980: stderr chunk (state=3): >>><<< 10202 1727204061.78997: stdout chunk (state=3): >>><<< 10202 1727204061.79018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 10202 1727204061.79031: handler run complete 10202 1727204061.79060: Evaluated conditional (False): False 10202 1727204061.79085: attempt loop complete, returning result 10202 1727204061.79098: _execute() done 10202 1727204061.79106: dumping result to json 10202 1727204061.79115: done dumping result, returning 10202 1727204061.79127: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [127b8e07-fff9-0b04-2570-0000000003fb] 10202 1727204061.79136: sending task result for task 127b8e07-fff9-0b04-2570-0000000003fb ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.024028", "end": "2024-09-24 14:54:21.728716", "rc": 0, "start": "2024-09-24 14:54:21.704688" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 10202 1727204061.79336: no more pending results, returning what we have 10202 1727204061.79340: results queue empty 10202 1727204061.79341: checking for any_errors_fatal 10202 1727204061.79348: done checking for any_errors_fatal 10202 1727204061.79349: checking for max_fail_percentage 10202 1727204061.79350: done checking for max_fail_percentage 10202 1727204061.79351: checking to see if all hosts have failed and the running result is not ok 10202 1727204061.79352: done checking to see if all hosts have failed 10202 1727204061.79353: getting the remaining hosts for this loop 10202 1727204061.79355: done getting the remaining hosts for this loop 10202 1727204061.79359: getting the next task for host managed-node3 10202 1727204061.79367: done getting next task for host managed-node3 10202 1727204061.79370: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10202 1727204061.79374: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204061.79378: getting variables 10202 1727204061.79380: in VariableManager get_vars() 10202 1727204061.79421: Calling all_inventory to load vars for managed-node3 10202 1727204061.79424: Calling groups_inventory to load vars for managed-node3 10202 1727204061.79426: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204061.79442: Calling all_plugins_play to load vars for managed-node3 10202 1727204061.79445: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204061.79449: Calling groups_plugins_play to load vars for managed-node3 10202 1727204061.79989: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003fb 10202 1727204061.79994: WORKER PROCESS EXITING 10202 1727204061.81736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204061.84011: done with get_vars() 10202 1727204061.84052: done getting variables 10202 1727204061.84121: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and 
ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.459) 0:00:23.516 ***** 10202 1727204061.84157: entering _queue_task() for managed-node3/set_fact 10202 1727204061.84536: worker is 1 (out of 1 available) 10202 1727204061.84549: exiting _queue_task() for managed-node3/set_fact 10202 1727204061.84770: done queuing things up, now waiting for results queue to drain 10202 1727204061.84772: waiting for pending results... 10202 1727204061.84904: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10202 1727204061.85108: in run() - task 127b8e07-fff9-0b04-2570-0000000003fc 10202 1727204061.85112: variable 'ansible_search_path' from source: unknown 10202 1727204061.85116: variable 'ansible_search_path' from source: unknown 10202 1727204061.85119: calling self._execute() 10202 1727204061.85217: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204061.85231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204061.85247: variable 'omit' from source: magic vars 10202 1727204061.85656: variable 'ansible_distribution_major_version' from source: facts 10202 1727204061.85677: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204061.85823: variable 'nm_profile_exists' from source: set_fact 10202 1727204061.85845: Evaluated conditional (nm_profile_exists.rc == 0): True 10202 1727204061.85855: variable 'omit' from source: magic vars 10202 1727204061.86171: variable 'omit' from source: magic vars 10202 1727204061.86175: variable 'omit' from source: magic vars 10202 1727204061.86290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204061.86294: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204061.86297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204061.86310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204061.86415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204061.86454: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204061.86464: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204061.86475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204061.86706: Set connection var ansible_shell_type to sh 10202 1727204061.86720: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204061.87073: Set connection var ansible_connection to ssh 10202 1727204061.87076: Set connection var ansible_shell_executable to /bin/sh 10202 1727204061.87078: Set connection var ansible_pipelining to False 10202 1727204061.87081: Set connection var ansible_timeout to 10 10202 1727204061.87083: variable 'ansible_shell_executable' from source: unknown 10202 1727204061.87085: variable 'ansible_connection' from source: unknown 10202 1727204061.87087: variable 'ansible_module_compression' from source: unknown 10202 1727204061.87089: variable 'ansible_shell_type' from source: unknown 10202 1727204061.87092: variable 'ansible_shell_executable' from source: unknown 10202 1727204061.87093: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204061.87096: variable 'ansible_pipelining' from source: unknown 10202 1727204061.87098: variable 'ansible_timeout' from source: unknown 10202 1727204061.87101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 
1727204061.87250: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204061.87307: variable 'omit' from source: magic vars 10202 1727204061.87317: starting attempt loop 10202 1727204061.87508: running the handler 10202 1727204061.87512: handler run complete 10202 1727204061.87514: attempt loop complete, returning result 10202 1727204061.87516: _execute() done 10202 1727204061.87519: dumping result to json 10202 1727204061.87521: done dumping result, returning 10202 1727204061.87523: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-0b04-2570-0000000003fc] 10202 1727204061.87526: sending task result for task 127b8e07-fff9-0b04-2570-0000000003fc 10202 1727204061.87603: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003fc 10202 1727204061.87607: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10202 1727204061.87672: no more pending results, returning what we have 10202 1727204061.87675: results queue empty 10202 1727204061.87676: checking for any_errors_fatal 10202 1727204061.87686: done checking for any_errors_fatal 10202 1727204061.87686: checking for max_fail_percentage 10202 1727204061.87688: done checking for max_fail_percentage 10202 1727204061.87689: checking to see if all hosts have failed and the running result is not ok 10202 1727204061.87691: done checking to see if all hosts have failed 10202 1727204061.87691: getting the remaining hosts for this loop 10202 1727204061.87694: done getting the remaining hosts for this loop 10202 
1727204061.87698: getting the next task for host managed-node3 10202 1727204061.87709: done getting next task for host managed-node3 10202 1727204061.87712: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10202 1727204061.87717: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204061.87722: getting variables 10202 1727204061.87724: in VariableManager get_vars() 10202 1727204061.87772: Calling all_inventory to load vars for managed-node3 10202 1727204061.87775: Calling groups_inventory to load vars for managed-node3 10202 1727204061.87777: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204061.87791: Calling all_plugins_play to load vars for managed-node3 10202 1727204061.87794: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204061.87797: Calling groups_plugins_play to load vars for managed-node3 10202 1727204061.91876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204061.96683: done with get_vars() 10202 1727204061.96725: done getting variables 10202 1727204061.96916: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204061.97130: variable 'profile' from source: include params 10202 1727204061.97134: variable 'item' from source: include params 10202 1727204061.97203: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.132) 0:00:23.648 ***** 10202 1727204061.97362: entering _queue_task() for managed-node3/command 10202 1727204061.98147: worker is 1 (out of 1 available) 10202 1727204061.98161: exiting _queue_task() for managed-node3/command 10202 1727204061.98179: done queuing things up, now waiting for results queue to drain 10202 1727204061.98180: waiting for pending results... 
10202 1727204061.98787: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 10202 1727204061.99174: in run() - task 127b8e07-fff9-0b04-2570-0000000003fe 10202 1727204061.99181: variable 'ansible_search_path' from source: unknown 10202 1727204061.99185: variable 'ansible_search_path' from source: unknown 10202 1727204061.99189: calling self._execute() 10202 1727204061.99193: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204061.99196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204061.99200: variable 'omit' from source: magic vars 10202 1727204062.00189: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.00212: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.00356: variable 'profile_stat' from source: set_fact 10202 1727204062.00772: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204062.00776: when evaluation is False, skipping this task 10202 1727204062.00779: _execute() done 10202 1727204062.00782: dumping result to json 10202 1727204062.00784: done dumping result, returning 10202 1727204062.00786: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [127b8e07-fff9-0b04-2570-0000000003fe] 10202 1727204062.00789: sending task result for task 127b8e07-fff9-0b04-2570-0000000003fe skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204062.00937: no more pending results, returning what we have 10202 1727204062.00942: results queue empty 10202 1727204062.00944: checking for any_errors_fatal 10202 1727204062.00954: done checking for any_errors_fatal 10202 1727204062.00954: checking for max_fail_percentage 10202 1727204062.00956: done checking for max_fail_percentage 10202 1727204062.00957: checking to see if all hosts 
have failed and the running result is not ok 10202 1727204062.00958: done checking to see if all hosts have failed 10202 1727204062.00959: getting the remaining hosts for this loop 10202 1727204062.00961: done getting the remaining hosts for this loop 10202 1727204062.00967: getting the next task for host managed-node3 10202 1727204062.00975: done getting next task for host managed-node3 10202 1727204062.00978: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10202 1727204062.00983: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204062.00988: getting variables 10202 1727204062.00990: in VariableManager get_vars() 10202 1727204062.01041: Calling all_inventory to load vars for managed-node3 10202 1727204062.01044: Calling groups_inventory to load vars for managed-node3 10202 1727204062.01046: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.01062: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.01065: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.01187: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003fe 10202 1727204062.01190: WORKER PROCESS EXITING 10202 1727204062.01196: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.04747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.09768: done with get_vars() 10202 1727204062.09814: done getting variables 10202 1727204062.10009: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204062.10257: variable 'profile' from source: include params 10202 1727204062.10262: variable 'item' from source: include params 10202 1727204062.10464: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.132) 0:00:23.781 ***** 10202 1727204062.10581: entering _queue_task() for managed-node3/set_fact 10202 1727204062.11434: worker is 1 (out of 1 available) 10202 1727204062.11451: exiting _queue_task() for managed-node3/set_fact 10202 
1727204062.11591: done queuing things up, now waiting for results queue to drain 10202 1727204062.11593: waiting for pending results... 10202 1727204062.12036: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 10202 1727204062.12263: in run() - task 127b8e07-fff9-0b04-2570-0000000003ff 10202 1727204062.12273: variable 'ansible_search_path' from source: unknown 10202 1727204062.12277: variable 'ansible_search_path' from source: unknown 10202 1727204062.12430: calling self._execute() 10202 1727204062.12612: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.12867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.12874: variable 'omit' from source: magic vars 10202 1727204062.13517: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.13651: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.13906: variable 'profile_stat' from source: set_fact 10202 1727204062.13979: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204062.13988: when evaluation is False, skipping this task 10202 1727204062.14173: _execute() done 10202 1727204062.14177: dumping result to json 10202 1727204062.14179: done dumping result, returning 10202 1727204062.14182: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [127b8e07-fff9-0b04-2570-0000000003ff] 10202 1727204062.14185: sending task result for task 127b8e07-fff9-0b04-2570-0000000003ff skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204062.14311: no more pending results, returning what we have 10202 1727204062.14315: results queue empty 10202 1727204062.14316: checking for any_errors_fatal 10202 1727204062.14324: done checking for any_errors_fatal 10202 1727204062.14325: 
checking for max_fail_percentage 10202 1727204062.14326: done checking for max_fail_percentage 10202 1727204062.14331: checking to see if all hosts have failed and the running result is not ok 10202 1727204062.14332: done checking to see if all hosts have failed 10202 1727204062.14333: getting the remaining hosts for this loop 10202 1727204062.14336: done getting the remaining hosts for this loop 10202 1727204062.14341: getting the next task for host managed-node3 10202 1727204062.14349: done getting next task for host managed-node3 10202 1727204062.14352: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10202 1727204062.14358: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204062.14364: getting variables 10202 1727204062.14571: in VariableManager get_vars() 10202 1727204062.14621: Calling all_inventory to load vars for managed-node3 10202 1727204062.14624: Calling groups_inventory to load vars for managed-node3 10202 1727204062.14626: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.14646: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.14649: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.14652: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.15403: done sending task result for task 127b8e07-fff9-0b04-2570-0000000003ff 10202 1727204062.15409: WORKER PROCESS EXITING 10202 1727204062.18208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.22859: done with get_vars() 10202 1727204062.22906: done getting variables 10202 1727204062.23177: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204062.23418: variable 'profile' from source: include params 10202 1727204062.23422: variable 'item' from source: include params 10202 1727204062.23600: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.130) 0:00:23.911 ***** 10202 1727204062.23640: entering _queue_task() for managed-node3/command 10202 1727204062.24456: worker is 1 (out of 1 available) 10202 1727204062.24473: exiting _queue_task() for managed-node3/command 10202 
1727204062.24491: done queuing things up, now waiting for results queue to drain 10202 1727204062.24492: waiting for pending results... 10202 1727204062.25040: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 10202 1727204062.25258: in run() - task 127b8e07-fff9-0b04-2570-000000000400 10202 1727204062.25326: variable 'ansible_search_path' from source: unknown 10202 1727204062.25333: variable 'ansible_search_path' from source: unknown 10202 1727204062.25369: calling self._execute() 10202 1727204062.25676: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.25683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.25696: variable 'omit' from source: magic vars 10202 1727204062.26570: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.26583: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.26880: variable 'profile_stat' from source: set_fact 10202 1727204062.26897: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204062.26900: when evaluation is False, skipping this task 10202 1727204062.26904: _execute() done 10202 1727204062.26907: dumping result to json 10202 1727204062.26910: done dumping result, returning 10202 1727204062.26920: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [127b8e07-fff9-0b04-2570-000000000400] 10202 1727204062.26923: sending task result for task 127b8e07-fff9-0b04-2570-000000000400 10202 1727204062.27103: done sending task result for task 127b8e07-fff9-0b04-2570-000000000400 10202 1727204062.27107: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204062.27194: no more pending results, returning what we have 10202 1727204062.27199: results queue empty 10202 
1727204062.27200: checking for any_errors_fatal 10202 1727204062.27208: done checking for any_errors_fatal 10202 1727204062.27209: checking for max_fail_percentage 10202 1727204062.27212: done checking for max_fail_percentage 10202 1727204062.27213: checking to see if all hosts have failed and the running result is not ok 10202 1727204062.27215: done checking to see if all hosts have failed 10202 1727204062.27220: getting the remaining hosts for this loop 10202 1727204062.27223: done getting the remaining hosts for this loop 10202 1727204062.27230: getting the next task for host managed-node3 10202 1727204062.27239: done getting next task for host managed-node3 10202 1727204062.27242: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10202 1727204062.27248: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204062.27254: getting variables 10202 1727204062.27256: in VariableManager get_vars() 10202 1727204062.27304: Calling all_inventory to load vars for managed-node3 10202 1727204062.27307: Calling groups_inventory to load vars for managed-node3 10202 1727204062.27309: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.27482: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.27487: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.27492: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.31517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.36133: done with get_vars() 10202 1727204062.36293: done getting variables 10202 1727204062.36371: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204062.36745: variable 'profile' from source: include params 10202 1727204062.36750: variable 'item' from source: include params 10202 1727204062.36870: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.132) 0:00:24.044 ***** 10202 1727204062.36907: entering _queue_task() for managed-node3/set_fact 10202 1727204062.37738: worker is 1 (out of 1 available) 10202 1727204062.37752: exiting _queue_task() for managed-node3/set_fact 10202 1727204062.37919: done queuing things up, now waiting for results queue to drain 10202 1727204062.37921: waiting for pending results... 
10202 1727204062.38192: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 10202 1727204062.38575: in run() - task 127b8e07-fff9-0b04-2570-000000000401 10202 1727204062.38583: variable 'ansible_search_path' from source: unknown 10202 1727204062.38586: variable 'ansible_search_path' from source: unknown 10202 1727204062.38588: calling self._execute() 10202 1727204062.38902: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.38906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.39010: variable 'omit' from source: magic vars 10202 1727204062.39806: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.39817: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.40213: variable 'profile_stat' from source: set_fact 10202 1727204062.40220: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204062.40223: when evaluation is False, skipping this task 10202 1727204062.40226: _execute() done 10202 1727204062.40230: dumping result to json 10202 1727204062.40232: done dumping result, returning 10202 1727204062.40235: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [127b8e07-fff9-0b04-2570-000000000401] 10202 1727204062.40237: sending task result for task 127b8e07-fff9-0b04-2570-000000000401 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204062.40511: no more pending results, returning what we have 10202 1727204062.40516: results queue empty 10202 1727204062.40517: checking for any_errors_fatal 10202 1727204062.40524: done checking for any_errors_fatal 10202 1727204062.40525: checking for max_fail_percentage 10202 1727204062.40526: done checking for max_fail_percentage 10202 1727204062.40531: checking to see if all hosts 
have failed and the running result is not ok 10202 1727204062.40532: done checking to see if all hosts have failed 10202 1727204062.40533: getting the remaining hosts for this loop 10202 1727204062.40535: done getting the remaining hosts for this loop 10202 1727204062.40540: getting the next task for host managed-node3 10202 1727204062.40550: done getting next task for host managed-node3 10202 1727204062.40553: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10202 1727204062.40557: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204062.40561: getting variables 10202 1727204062.40563: in VariableManager get_vars() 10202 1727204062.40610: Calling all_inventory to load vars for managed-node3 10202 1727204062.40613: Calling groups_inventory to load vars for managed-node3 10202 1727204062.40615: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.40633: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.40636: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.40640: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.41221: done sending task result for task 127b8e07-fff9-0b04-2570-000000000401 10202 1727204062.41226: WORKER PROCESS EXITING 10202 1727204062.44415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.49451: done with get_vars() 10202 1727204062.49607: done getting variables 10202 1727204062.49682: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204062.50061: variable 'profile' from source: include params 10202 1727204062.50070: variable 'item' from source: include params 10202 1727204062.50194: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.133) 0:00:24.177 ***** 10202 1727204062.50351: entering _queue_task() for managed-node3/assert 10202 1727204062.51253: worker is 1 (out of 1 available) 10202 1727204062.51271: exiting _queue_task() for managed-node3/assert 10202 
1727204062.51286: done queuing things up, now waiting for results queue to drain 10202 1727204062.51288: waiting for pending results... 10202 1727204062.51699: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.0' 10202 1727204062.51914: in run() - task 127b8e07-fff9-0b04-2570-000000000267 10202 1727204062.51933: variable 'ansible_search_path' from source: unknown 10202 1727204062.51937: variable 'ansible_search_path' from source: unknown 10202 1727204062.52005: calling self._execute() 10202 1727204062.52278: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.52299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.52334: variable 'omit' from source: magic vars 10202 1727204062.52804: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.52832: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.52852: variable 'omit' from source: magic vars 10202 1727204062.52963: variable 'omit' from source: magic vars 10202 1727204062.53035: variable 'profile' from source: include params 10202 1727204062.53045: variable 'item' from source: include params 10202 1727204062.53125: variable 'item' from source: include params 10202 1727204062.53154: variable 'omit' from source: magic vars 10202 1727204062.53216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204062.53264: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204062.53299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204062.53401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204062.53404: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204062.53407: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204062.53410: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.53412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.53513: Set connection var ansible_shell_type to sh 10202 1727204062.53524: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204062.53539: Set connection var ansible_connection to ssh 10202 1727204062.53547: Set connection var ansible_shell_executable to /bin/sh 10202 1727204062.53557: Set connection var ansible_pipelining to False 10202 1727204062.53567: Set connection var ansible_timeout to 10 10202 1727204062.53594: variable 'ansible_shell_executable' from source: unknown 10202 1727204062.53601: variable 'ansible_connection' from source: unknown 10202 1727204062.53606: variable 'ansible_module_compression' from source: unknown 10202 1727204062.53617: variable 'ansible_shell_type' from source: unknown 10202 1727204062.53623: variable 'ansible_shell_executable' from source: unknown 10202 1727204062.53631: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.53641: variable 'ansible_pipelining' from source: unknown 10202 1727204062.53648: variable 'ansible_timeout' from source: unknown 10202 1727204062.53654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.53839: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204062.53948: variable 'omit' from source: magic vars 10202 1727204062.53951: starting 
attempt loop 10202 1727204062.53954: running the handler 10202 1727204062.54035: variable 'lsr_net_profile_exists' from source: set_fact 10202 1727204062.54047: Evaluated conditional (lsr_net_profile_exists): True 10202 1727204062.54064: handler run complete 10202 1727204062.54089: attempt loop complete, returning result 10202 1727204062.54097: _execute() done 10202 1727204062.54105: dumping result to json 10202 1727204062.54113: done dumping result, returning 10202 1727204062.54125: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.0' [127b8e07-fff9-0b04-2570-000000000267] 10202 1727204062.54140: sending task result for task 127b8e07-fff9-0b04-2570-000000000267 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204062.54455: no more pending results, returning what we have 10202 1727204062.54459: results queue empty 10202 1727204062.54460: checking for any_errors_fatal 10202 1727204062.54471: done checking for any_errors_fatal 10202 1727204062.54472: checking for max_fail_percentage 10202 1727204062.54473: done checking for max_fail_percentage 10202 1727204062.54474: checking to see if all hosts have failed and the running result is not ok 10202 1727204062.54476: done checking to see if all hosts have failed 10202 1727204062.54476: getting the remaining hosts for this loop 10202 1727204062.54479: done getting the remaining hosts for this loop 10202 1727204062.54484: getting the next task for host managed-node3 10202 1727204062.54495: done getting next task for host managed-node3 10202 1727204062.54498: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10202 1727204062.54502: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204062.54507: getting variables 10202 1727204062.54509: in VariableManager get_vars() 10202 1727204062.54563: Calling all_inventory to load vars for managed-node3 10202 1727204062.54716: Calling groups_inventory to load vars for managed-node3 10202 1727204062.54720: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.54755: done sending task result for task 127b8e07-fff9-0b04-2570-000000000267 10202 1727204062.54759: WORKER PROCESS EXITING 10202 1727204062.54775: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.54805: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.54811: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.56824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.59398: done with get_vars() 10202 1727204062.59440: done getting variables 10202 1727204062.59520: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204062.59714: variable 'profile' from source: include params 10202 1727204062.59718: variable 'item' from source: include params 10202 1727204062.59807: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.095) 0:00:24.273 ***** 10202 1727204062.59871: entering _queue_task() for managed-node3/assert 10202 1727204062.60395: worker is 1 (out of 1 available) 10202 1727204062.60410: exiting _queue_task() for managed-node3/assert 10202 1727204062.60424: done queuing things up, now waiting for results queue to drain 10202 1727204062.60425: waiting for pending results... 10202 1727204062.60655: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 10202 1727204062.60793: in run() - task 127b8e07-fff9-0b04-2570-000000000268 10202 1727204062.60901: variable 'ansible_search_path' from source: unknown 10202 1727204062.60906: variable 'ansible_search_path' from source: unknown 10202 1727204062.60909: calling self._execute() 10202 1727204062.60985: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.60999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.61023: variable 'omit' from source: magic vars 10202 1727204062.61460: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.61482: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.61496: variable 'omit' from source: magic vars 10202 1727204062.61553: variable 'omit' from source: magic vars 10202 1727204062.61680: variable 'profile' from source: include params 10202 1727204062.61772: variable 'item' from source: include params 10202 1727204062.61777: variable 'item' from source: include params 10202 1727204062.61794: variable 'omit' from source: magic vars 10202 1727204062.61890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204062.61994: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204062.61998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204062.62002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204062.62004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204062.62103: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204062.62106: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.62109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.62344: Set connection var ansible_shell_type to sh 10202 1727204062.62360: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204062.62380: Set connection var ansible_connection to ssh 10202 1727204062.62392: Set connection var ansible_shell_executable to /bin/sh 10202 1727204062.62403: Set connection var ansible_pipelining to False 10202 1727204062.62475: Set connection var ansible_timeout to 10 10202 1727204062.62478: variable 'ansible_shell_executable' from source: unknown 10202 1727204062.62480: variable 'ansible_connection' from source: unknown 10202 1727204062.62483: variable 'ansible_module_compression' from source: unknown 10202 1727204062.62485: variable 'ansible_shell_type' from source: unknown 10202 1727204062.62487: variable 'ansible_shell_executable' from source: unknown 10202 1727204062.62489: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.62491: variable 'ansible_pipelining' from source: unknown 10202 1727204062.62493: variable 'ansible_timeout' from source: unknown 10202 1727204062.62497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 
1727204062.62667: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204062.62690: variable 'omit' from source: magic vars 10202 1727204062.62704: starting attempt loop 10202 1727204062.62712: running the handler 10202 1727204062.62855: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10202 1727204062.62908: Evaluated conditional (lsr_net_profile_ansible_managed): True 10202 1727204062.62911: handler run complete 10202 1727204062.62916: attempt loop complete, returning result 10202 1727204062.62918: _execute() done 10202 1727204062.62920: dumping result to json 10202 1727204062.62922: done dumping result, returning 10202 1727204062.62970: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [127b8e07-fff9-0b04-2570-000000000268] 10202 1727204062.62973: sending task result for task 127b8e07-fff9-0b04-2570-000000000268 10202 1727204062.63079: done sending task result for task 127b8e07-fff9-0b04-2570-000000000268 10202 1727204062.63083: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204062.63142: no more pending results, returning what we have 10202 1727204062.63146: results queue empty 10202 1727204062.63148: checking for any_errors_fatal 10202 1727204062.63159: done checking for any_errors_fatal 10202 1727204062.63159: checking for max_fail_percentage 10202 1727204062.63161: done checking for max_fail_percentage 10202 1727204062.63163: checking to see if all hosts have failed and the running result is not ok 10202 1727204062.63164: done checking to see if all hosts have failed 10202 1727204062.63165: getting the remaining hosts for this loop 10202 1727204062.63169: done 
getting the remaining hosts for this loop 10202 1727204062.63174: getting the next task for host managed-node3 10202 1727204062.63181: done getting next task for host managed-node3 10202 1727204062.63184: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10202 1727204062.63187: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204062.63192: getting variables 10202 1727204062.63194: in VariableManager get_vars() 10202 1727204062.63242: Calling all_inventory to load vars for managed-node3 10202 1727204062.63245: Calling groups_inventory to load vars for managed-node3 10202 1727204062.63248: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.63261: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.63264: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.63479: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.65533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.67734: done with get_vars() 10202 1727204062.67775: done getting variables 10202 1727204062.67847: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 10202 1727204062.68047: variable 'profile' from source: include params 10202 1727204062.68051: variable 'item' from source: include params 10202 1727204062.68117: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.084) 0:00:24.358 ***** 10202 1727204062.68277: entering _queue_task() for managed-node3/assert 10202 1727204062.69116: worker is 1 (out of 1 available) 10202 1727204062.69129: exiting _queue_task() for managed-node3/assert 10202 1727204062.69144: done queuing things up, now waiting for results queue to drain 10202 1727204062.69145: waiting for pending results... 10202 1727204062.69786: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.0 10202 1727204062.70096: in run() - task 127b8e07-fff9-0b04-2570-000000000269 10202 1727204062.70102: variable 'ansible_search_path' from source: unknown 10202 1727204062.70106: variable 'ansible_search_path' from source: unknown 10202 1727204062.70109: calling self._execute() 10202 1727204062.70260: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.70278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.70296: variable 'omit' from source: magic vars 10202 1727204062.71022: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.71046: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.71069: variable 'omit' from source: magic vars 10202 1727204062.71120: variable 'omit' from source: magic vars 10202 1727204062.71258: variable 'profile' from source: include params 10202 1727204062.71472: variable 'item' from source: include params 10202 
1727204062.71476: variable 'item' from source: include params 10202 1727204062.71479: variable 'omit' from source: magic vars 10202 1727204062.71481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204062.71484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204062.71507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204062.71533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204062.71553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204062.71605: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204062.71617: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.71627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.71755: Set connection var ansible_shell_type to sh 10202 1727204062.71773: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204062.71786: Set connection var ansible_connection to ssh 10202 1727204062.71797: Set connection var ansible_shell_executable to /bin/sh 10202 1727204062.71807: Set connection var ansible_pipelining to False 10202 1727204062.71826: Set connection var ansible_timeout to 10 10202 1727204062.71861: variable 'ansible_shell_executable' from source: unknown 10202 1727204062.71874: variable 'ansible_connection' from source: unknown 10202 1727204062.71884: variable 'ansible_module_compression' from source: unknown 10202 1727204062.71892: variable 'ansible_shell_type' from source: unknown 10202 1727204062.71900: variable 'ansible_shell_executable' from source: unknown 10202 1727204062.71929: variable 'ansible_host' from source: host 
vars for 'managed-node3' 10202 1727204062.71933: variable 'ansible_pipelining' from source: unknown 10202 1727204062.71936: variable 'ansible_timeout' from source: unknown 10202 1727204062.71938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.72113: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204062.72148: variable 'omit' from source: magic vars 10202 1727204062.72171: starting attempt loop 10202 1727204062.72174: running the handler 10202 1727204062.72306: variable 'lsr_net_profile_fingerprint' from source: set_fact 10202 1727204062.72366: Evaluated conditional (lsr_net_profile_fingerprint): True 10202 1727204062.72372: handler run complete 10202 1727204062.72374: attempt loop complete, returning result 10202 1727204062.72376: _execute() done 10202 1727204062.72379: dumping result to json 10202 1727204062.72381: done dumping result, returning 10202 1727204062.72387: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.0 [127b8e07-fff9-0b04-2570-000000000269] 10202 1727204062.72398: sending task result for task 127b8e07-fff9-0b04-2570-000000000269 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204062.72677: no more pending results, returning what we have 10202 1727204062.72681: results queue empty 10202 1727204062.72683: checking for any_errors_fatal 10202 1727204062.72695: done checking for any_errors_fatal 10202 1727204062.72696: checking for max_fail_percentage 10202 1727204062.72698: done checking for max_fail_percentage 10202 1727204062.72699: checking to see if all hosts have failed and the running result is not ok 10202 1727204062.72700: done checking to see 
if all hosts have failed 10202 1727204062.72702: getting the remaining hosts for this loop 10202 1727204062.72704: done getting the remaining hosts for this loop 10202 1727204062.72709: getting the next task for host managed-node3 10202 1727204062.72720: done getting next task for host managed-node3 10202 1727204062.72724: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10202 1727204062.72727: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204062.72732: getting variables 10202 1727204062.72734: in VariableManager get_vars() 10202 1727204062.72986: Calling all_inventory to load vars for managed-node3 10202 1727204062.72989: Calling groups_inventory to load vars for managed-node3 10202 1727204062.72992: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.73007: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.73012: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.73015: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.73677: done sending task result for task 127b8e07-fff9-0b04-2570-000000000269 10202 1727204062.73682: WORKER PROCESS EXITING 10202 1727204062.76545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.78940: done with get_vars() 10202 1727204062.78971: done getting variables TASK [Include the task 'get_profile_stat.yml'] 
********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.108) 0:00:24.466 ***** 10202 1727204062.79087: entering _queue_task() for managed-node3/include_tasks 10202 1727204062.79492: worker is 1 (out of 1 available) 10202 1727204062.79508: exiting _queue_task() for managed-node3/include_tasks 10202 1727204062.79524: done queuing things up, now waiting for results queue to drain 10202 1727204062.79525: waiting for pending results... 10202 1727204062.79843: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 10202 1727204062.79981: in run() - task 127b8e07-fff9-0b04-2570-00000000026d 10202 1727204062.79998: variable 'ansible_search_path' from source: unknown 10202 1727204062.80002: variable 'ansible_search_path' from source: unknown 10202 1727204062.80043: calling self._execute() 10202 1727204062.80152: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.80159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.80230: variable 'omit' from source: magic vars 10202 1727204062.80667: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.80674: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.80684: _execute() done 10202 1727204062.80688: dumping result to json 10202 1727204062.80690: done dumping result, returning 10202 1727204062.80699: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-0b04-2570-00000000026d] 10202 1727204062.80705: sending task result for task 127b8e07-fff9-0b04-2570-00000000026d 10202 1727204062.80846: done sending task result for task 127b8e07-fff9-0b04-2570-00000000026d 10202 1727204062.80849: WORKER PROCESS EXITING 10202 1727204062.80907: no more pending 
results, returning what we have 10202 1727204062.80911: in VariableManager get_vars() 10202 1727204062.80963: Calling all_inventory to load vars for managed-node3 10202 1727204062.80969: Calling groups_inventory to load vars for managed-node3 10202 1727204062.80971: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.80986: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.80989: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.80992: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.88729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.90904: done with get_vars() 10202 1727204062.90938: variable 'ansible_search_path' from source: unknown 10202 1727204062.90940: variable 'ansible_search_path' from source: unknown 10202 1727204062.90991: we have included files to process 10202 1727204062.90993: generating all_blocks data 10202 1727204062.90994: done generating all_blocks data 10202 1727204062.90998: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10202 1727204062.90999: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10202 1727204062.91001: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10202 1727204062.92045: done processing included file 10202 1727204062.92048: iterating over new_blocks loaded from include file 10202 1727204062.92050: in VariableManager get_vars() 10202 1727204062.92077: done with get_vars() 10202 1727204062.92079: filtering new block on tags 10202 1727204062.92106: done filtering new block on tags 10202 1727204062.92110: in VariableManager get_vars() 10202 1727204062.92132: done with 
get_vars() 10202 1727204062.92133: filtering new block on tags 10202 1727204062.92163: done filtering new block on tags 10202 1727204062.92168: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 10202 1727204062.92174: extending task lists for all hosts with included blocks 10202 1727204062.92355: done extending task lists 10202 1727204062.92357: done processing included files 10202 1727204062.92357: results queue empty 10202 1727204062.92358: checking for any_errors_fatal 10202 1727204062.92361: done checking for any_errors_fatal 10202 1727204062.92362: checking for max_fail_percentage 10202 1727204062.92363: done checking for max_fail_percentage 10202 1727204062.92364: checking to see if all hosts have failed and the running result is not ok 10202 1727204062.92364: done checking to see if all hosts have failed 10202 1727204062.92370: getting the remaining hosts for this loop 10202 1727204062.92371: done getting the remaining hosts for this loop 10202 1727204062.92373: getting the next task for host managed-node3 10202 1727204062.92377: done getting next task for host managed-node3 10202 1727204062.92379: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10202 1727204062.92381: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204062.92383: getting variables 10202 1727204062.92384: in VariableManager get_vars() 10202 1727204062.92397: Calling all_inventory to load vars for managed-node3 10202 1727204062.92399: Calling groups_inventory to load vars for managed-node3 10202 1727204062.92401: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.92406: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.92408: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.92412: Calling groups_plugins_play to load vars for managed-node3 10202 1727204062.94035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204062.96253: done with get_vars() 10202 1727204062.96297: done getting variables 10202 1727204062.96354: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.173) 0:00:24.639 ***** 10202 1727204062.96394: entering _queue_task() for managed-node3/set_fact 10202 1727204062.96792: worker is 1 (out of 1 available) 10202 1727204062.96812: exiting _queue_task() for managed-node3/set_fact 10202 1727204062.96828: done queuing things up, now waiting for results queue to drain 10202 1727204062.96830: 
waiting for pending results... 10202 1727204062.97269: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 10202 1727204062.97278: in run() - task 127b8e07-fff9-0b04-2570-000000000440 10202 1727204062.97298: variable 'ansible_search_path' from source: unknown 10202 1727204062.97307: variable 'ansible_search_path' from source: unknown 10202 1727204062.97364: calling self._execute() 10202 1727204062.97490: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.97505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.97522: variable 'omit' from source: magic vars 10202 1727204062.97984: variable 'ansible_distribution_major_version' from source: facts 10202 1727204062.98014: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204062.98070: variable 'omit' from source: magic vars 10202 1727204062.98091: variable 'omit' from source: magic vars 10202 1727204062.98147: variable 'omit' from source: magic vars 10202 1727204062.98202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204062.98257: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204062.98287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204062.98313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204062.98444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204062.98448: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204062.98451: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.98454: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.98516: Set connection var ansible_shell_type to sh 10202 1727204062.98528: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204062.98540: Set connection var ansible_connection to ssh 10202 1727204062.98551: Set connection var ansible_shell_executable to /bin/sh 10202 1727204062.98570: Set connection var ansible_pipelining to False 10202 1727204062.98584: Set connection var ansible_timeout to 10 10202 1727204062.98615: variable 'ansible_shell_executable' from source: unknown 10202 1727204062.98624: variable 'ansible_connection' from source: unknown 10202 1727204062.98632: variable 'ansible_module_compression' from source: unknown 10202 1727204062.98640: variable 'ansible_shell_type' from source: unknown 10202 1727204062.98647: variable 'ansible_shell_executable' from source: unknown 10202 1727204062.98655: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204062.98664: variable 'ansible_pipelining' from source: unknown 10202 1727204062.98685: variable 'ansible_timeout' from source: unknown 10202 1727204062.98783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204062.98862: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204062.98887: variable 'omit' from source: magic vars 10202 1727204062.98905: starting attempt loop 10202 1727204062.98913: running the handler 10202 1727204062.98932: handler run complete 10202 1727204062.98947: attempt loop complete, returning result 10202 1727204062.98954: _execute() done 10202 1727204062.98962: dumping result to json 10202 1727204062.98974: done dumping result, returning 10202 
1727204062.98986: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-0b04-2570-000000000440] 10202 1727204062.99003: sending task result for task 127b8e07-fff9-0b04-2570-000000000440 10202 1727204062.99180: done sending task result for task 127b8e07-fff9-0b04-2570-000000000440 10202 1727204062.99184: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10202 1727204062.99250: no more pending results, returning what we have 10202 1727204062.99254: results queue empty 10202 1727204062.99255: checking for any_errors_fatal 10202 1727204062.99257: done checking for any_errors_fatal 10202 1727204062.99258: checking for max_fail_percentage 10202 1727204062.99260: done checking for max_fail_percentage 10202 1727204062.99261: checking to see if all hosts have failed and the running result is not ok 10202 1727204062.99262: done checking to see if all hosts have failed 10202 1727204062.99263: getting the remaining hosts for this loop 10202 1727204062.99267: done getting the remaining hosts for this loop 10202 1727204062.99272: getting the next task for host managed-node3 10202 1727204062.99280: done getting next task for host managed-node3 10202 1727204062.99283: ^ task is: TASK: Stat profile file 10202 1727204062.99288: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204062.99294: getting variables 10202 1727204062.99296: in VariableManager get_vars() 10202 1727204062.99458: Calling all_inventory to load vars for managed-node3 10202 1727204062.99461: Calling groups_inventory to load vars for managed-node3 10202 1727204062.99463: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204062.99554: Calling all_plugins_play to load vars for managed-node3 10202 1727204062.99558: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204062.99561: Calling groups_plugins_play to load vars for managed-node3 10202 1727204063.01328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204063.03648: done with get_vars() 10202 1727204063.03678: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.073) 0:00:24.713 ***** 10202 1727204063.03773: entering _queue_task() for managed-node3/stat 10202 1727204063.04170: worker is 1 (out of 1 available) 10202 1727204063.04187: exiting _queue_task() for managed-node3/stat 10202 1727204063.04202: done queuing things up, now waiting for results queue to drain 10202 1727204063.04203: waiting for pending results... 
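The `set_fact` result above initializes three flags to `false` before the profile file is actually inspected. A minimal sketch of what the task at `get_profile_stat.yml:3` likely looks like, reconstructed purely from the `ansible_facts` printed in the log (the exact YAML in the fedora.linux_system_roles collection may differ):

```yaml
# Hypothetical reconstruction of the task at get_profile_stat.yml:3,
# inferred from the ansible_facts shown in the "ok: [managed-node3]" result above.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Because `set_fact` runs entirely on the controller, the log shows the handler completing immediately ("running the handler ... handler run complete") with no `_low_level_execute_command()` calls, in contrast to the `stat` task that follows.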
10202 1727204063.04677: running TaskExecutor() for managed-node3/TASK: Stat profile file 10202 1727204063.04687: in run() - task 127b8e07-fff9-0b04-2570-000000000441 10202 1727204063.04712: variable 'ansible_search_path' from source: unknown 10202 1727204063.04721: variable 'ansible_search_path' from source: unknown 10202 1727204063.04777: calling self._execute() 10202 1727204063.04901: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204063.04914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204063.04929: variable 'omit' from source: magic vars 10202 1727204063.05375: variable 'ansible_distribution_major_version' from source: facts 10202 1727204063.05393: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204063.05405: variable 'omit' from source: magic vars 10202 1727204063.05527: variable 'omit' from source: magic vars 10202 1727204063.05589: variable 'profile' from source: include params 10202 1727204063.05600: variable 'item' from source: include params 10202 1727204063.05680: variable 'item' from source: include params 10202 1727204063.05707: variable 'omit' from source: magic vars 10202 1727204063.05767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204063.05813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204063.05841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204063.05875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204063.05962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204063.05970: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 
1727204063.05974: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204063.05976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204063.06061: Set connection var ansible_shell_type to sh 10202 1727204063.06082: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204063.06094: Set connection var ansible_connection to ssh 10202 1727204063.06105: Set connection var ansible_shell_executable to /bin/sh 10202 1727204063.06116: Set connection var ansible_pipelining to False 10202 1727204063.06128: Set connection var ansible_timeout to 10 10202 1727204063.06158: variable 'ansible_shell_executable' from source: unknown 10202 1727204063.06168: variable 'ansible_connection' from source: unknown 10202 1727204063.06178: variable 'ansible_module_compression' from source: unknown 10202 1727204063.06194: variable 'ansible_shell_type' from source: unknown 10202 1727204063.06202: variable 'ansible_shell_executable' from source: unknown 10202 1727204063.06294: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204063.06298: variable 'ansible_pipelining' from source: unknown 10202 1727204063.06306: variable 'ansible_timeout' from source: unknown 10202 1727204063.06309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204063.06473: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204063.06491: variable 'omit' from source: magic vars 10202 1727204063.06501: starting attempt loop 10202 1727204063.06514: running the handler 10202 1727204063.06540: _low_level_execute_command(): starting 10202 1727204063.06553: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204063.07407: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.07462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204063.07499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.07515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.07629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.09503: stdout chunk (state=3): >>>/root <<< 10202 1727204063.09725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.09733: stdout chunk (state=3): >>><<< 10202 1727204063.09736: stderr chunk (state=3): >>><<< 10202 1727204063.09772: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204063.09886: _low_level_execute_command(): starting 10202 1727204063.09890: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663 `" && echo ansible-tmp-1727204063.097638-11628-229087009288663="` echo /root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663 `" ) && sleep 0' 10202 1727204063.10778: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204063.10799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204063.10815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204063.10916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.10957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204063.10976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.10997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.11125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.13283: stdout chunk (state=3): >>>ansible-tmp-1727204063.097638-11628-229087009288663=/root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663 <<< 10202 1727204063.13471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.13498: stdout chunk (state=3): >>><<< 10202 1727204063.13501: stderr chunk (state=3): >>><<< 10202 1727204063.13517: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204063.097638-11628-229087009288663=/root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204063.13672: variable 'ansible_module_compression' from source: unknown 10202 1727204063.13675: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10202 1727204063.13724: variable 'ansible_facts' from source: unknown 10202 1727204063.13819: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/AnsiballZ_stat.py 10202 1727204063.14144: Sending initial data 10202 1727204063.14147: Sent initial data (152 bytes) 10202 1727204063.14672: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204063.14688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 
1727204063.14708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.14768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.14826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.14908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.16704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204063.16797: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204063.16894: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpkr66l5lf /root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/AnsiballZ_stat.py <<< 10202 1727204063.16897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/AnsiballZ_stat.py" <<< 10202 1727204063.16980: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpkr66l5lf" to remote "/root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/AnsiballZ_stat.py" <<< 10202 1727204063.17863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.17989: stderr chunk (state=3): >>><<< 10202 1727204063.17998: stdout chunk (state=3): >>><<< 10202 1727204063.18027: done transferring module to remote 10202 1727204063.18042: _low_level_execute_command(): starting 10202 1727204063.18045: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/ /root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/AnsiballZ_stat.py && sleep 0' 10202 1727204063.18705: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204063.18709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204063.18712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204063.18772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204063.18776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 
1727204063.18779: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204063.18781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.18784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204063.18786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204063.18789: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204063.18798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204063.18808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204063.18822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204063.18931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204063.18949: stderr chunk (state=3): >>>debug2: match found <<< 10202 1727204063.18952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.18955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204063.18957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.19003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.19086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.21139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.21280: stderr chunk (state=3): >>><<< 10202 1727204063.21298: stdout chunk (state=3): >>><<< 10202 1727204063.21324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204063.21419: _low_level_execute_command(): starting 10202 1727204063.21423: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/AnsiballZ_stat.py && sleep 0' 10202 1727204063.22122: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204063.22164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204063.22259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204063.22282: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.22305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204063.22325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.22349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.22467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.40118: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10202 1727204063.41820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204063.41824: stdout chunk (state=3): >>><<< 10202 1727204063.41827: stderr chunk (state=3): >>><<< 10202 1727204063.41829: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
10202 1727204063.41833: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204063.41836: _low_level_execute_command(): starting 10202 1727204063.41838: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204063.097638-11628-229087009288663/ > /dev/null 2>&1 && sleep 0' 10202 1727204063.42490: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.42566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204063.42595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.42629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.42732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.44931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.44958: stderr chunk (state=3): >>><<< 10202 1727204063.44970: stdout chunk (state=3): >>><<< 10202 1727204063.45147: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204063.45151: handler 
run complete 10202 1727204063.45153: attempt loop complete, returning result 10202 1727204063.45155: _execute() done 10202 1727204063.45157: dumping result to json 10202 1727204063.45159: done dumping result, returning 10202 1727204063.45160: done running TaskExecutor() for managed-node3/TASK: Stat profile file [127b8e07-fff9-0b04-2570-000000000441] 10202 1727204063.45162: sending task result for task 127b8e07-fff9-0b04-2570-000000000441 10202 1727204063.45242: done sending task result for task 127b8e07-fff9-0b04-2570-000000000441 10202 1727204063.45245: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 10202 1727204063.45327: no more pending results, returning what we have 10202 1727204063.45330: results queue empty 10202 1727204063.45331: checking for any_errors_fatal 10202 1727204063.45337: done checking for any_errors_fatal 10202 1727204063.45337: checking for max_fail_percentage 10202 1727204063.45339: done checking for max_fail_percentage 10202 1727204063.45340: checking to see if all hosts have failed and the running result is not ok 10202 1727204063.45341: done checking to see if all hosts have failed 10202 1727204063.45341: getting the remaining hosts for this loop 10202 1727204063.45343: done getting the remaining hosts for this loop 10202 1727204063.45346: getting the next task for host managed-node3 10202 1727204063.45352: done getting next task for host managed-node3 10202 1727204063.45354: ^ task is: TASK: Set NM profile exist flag based on the profile files 10202 1727204063.45358: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204063.45362: getting variables 10202 1727204063.45363: in VariableManager get_vars() 10202 1727204063.45426: Calling all_inventory to load vars for managed-node3 10202 1727204063.45429: Calling groups_inventory to load vars for managed-node3 10202 1727204063.45431: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204063.45442: Calling all_plugins_play to load vars for managed-node3 10202 1727204063.45445: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204063.45447: Calling groups_plugins_play to load vars for managed-node3 10202 1727204063.47404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204063.49604: done with get_vars() 10202 1727204063.49645: done getting variables 10202 1727204063.49715: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.459) 0:00:25.172 ***** 10202 1727204063.49753: entering _queue_task() for managed-node3/set_fact 
10202 1727204063.50147: worker is 1 (out of 1 available) 10202 1727204063.50162: exiting _queue_task() for managed-node3/set_fact 10202 1727204063.50179: done queuing things up, now waiting for results queue to drain 10202 1727204063.50181: waiting for pending results... 10202 1727204063.50591: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 10202 1727204063.50600: in run() - task 127b8e07-fff9-0b04-2570-000000000442 10202 1727204063.50604: variable 'ansible_search_path' from source: unknown 10202 1727204063.50607: variable 'ansible_search_path' from source: unknown 10202 1727204063.50638: calling self._execute() 10202 1727204063.50758: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204063.50776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204063.50794: variable 'omit' from source: magic vars 10202 1727204063.51219: variable 'ansible_distribution_major_version' from source: facts 10202 1727204063.51238: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204063.51388: variable 'profile_stat' from source: set_fact 10202 1727204063.51410: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204063.51419: when evaluation is False, skipping this task 10202 1727204063.51426: _execute() done 10202 1727204063.51484: dumping result to json 10202 1727204063.51488: done dumping result, returning 10202 1727204063.51491: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-0b04-2570-000000000442] 10202 1727204063.51494: sending task result for task 127b8e07-fff9-0b04-2570-000000000442 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204063.51637: no more pending results, returning what we have 10202 1727204063.51641: results queue empty 10202 
1727204063.51642: checking for any_errors_fatal 10202 1727204063.51653: done checking for any_errors_fatal 10202 1727204063.51654: checking for max_fail_percentage 10202 1727204063.51656: done checking for max_fail_percentage 10202 1727204063.51657: checking to see if all hosts have failed and the running result is not ok 10202 1727204063.51658: done checking to see if all hosts have failed 10202 1727204063.51658: getting the remaining hosts for this loop 10202 1727204063.51660: done getting the remaining hosts for this loop 10202 1727204063.51666: getting the next task for host managed-node3 10202 1727204063.51674: done getting next task for host managed-node3 10202 1727204063.51869: ^ task is: TASK: Get NM profile info 10202 1727204063.51874: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204063.51878: getting variables 10202 1727204063.51880: in VariableManager get_vars() 10202 1727204063.51919: Calling all_inventory to load vars for managed-node3 10202 1727204063.51922: Calling groups_inventory to load vars for managed-node3 10202 1727204063.51924: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204063.51934: done sending task result for task 127b8e07-fff9-0b04-2570-000000000442 10202 1727204063.51939: WORKER PROCESS EXITING 10202 1727204063.51950: Calling all_plugins_play to load vars for managed-node3 10202 1727204063.51953: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204063.51957: Calling groups_plugins_play to load vars for managed-node3 10202 1727204063.54195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204063.56385: done with get_vars() 10202 1727204063.56424: done getting variables 10202 1727204063.56495: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.067) 0:00:25.240 ***** 10202 1727204063.56534: entering _queue_task() for managed-node3/shell 10202 1727204063.56921: worker is 1 (out of 1 available) 10202 1727204063.56940: exiting _queue_task() for managed-node3/shell 10202 1727204063.56953: done queuing things up, now waiting for results queue to drain 10202 1727204063.56954: waiting for pending results... 
10202 1727204063.57194: running TaskExecutor() for managed-node3/TASK: Get NM profile info 10202 1727204063.57331: in run() - task 127b8e07-fff9-0b04-2570-000000000443 10202 1727204063.57353: variable 'ansible_search_path' from source: unknown 10202 1727204063.57361: variable 'ansible_search_path' from source: unknown 10202 1727204063.57427: calling self._execute() 10202 1727204063.57524: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204063.57641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204063.57645: variable 'omit' from source: magic vars 10202 1727204063.57986: variable 'ansible_distribution_major_version' from source: facts 10202 1727204063.58005: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204063.58019: variable 'omit' from source: magic vars 10202 1727204063.58075: variable 'omit' from source: magic vars 10202 1727204063.58196: variable 'profile' from source: include params 10202 1727204063.58209: variable 'item' from source: include params 10202 1727204063.58284: variable 'item' from source: include params 10202 1727204063.58472: variable 'omit' from source: magic vars 10202 1727204063.58475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204063.58480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204063.58483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204063.58486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204063.58489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204063.58512: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 
1727204063.58519: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204063.58525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204063.58630: Set connection var ansible_shell_type to sh 10202 1727204063.58641: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204063.58650: Set connection var ansible_connection to ssh 10202 1727204063.58658: Set connection var ansible_shell_executable to /bin/sh 10202 1727204063.58669: Set connection var ansible_pipelining to False 10202 1727204063.58678: Set connection var ansible_timeout to 10 10202 1727204063.58740: variable 'ansible_shell_executable' from source: unknown 10202 1727204063.58780: variable 'ansible_connection' from source: unknown 10202 1727204063.58788: variable 'ansible_module_compression' from source: unknown 10202 1727204063.58796: variable 'ansible_shell_type' from source: unknown 10202 1727204063.58803: variable 'ansible_shell_executable' from source: unknown 10202 1727204063.58811: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204063.58873: variable 'ansible_pipelining' from source: unknown 10202 1727204063.58877: variable 'ansible_timeout' from source: unknown 10202 1727204063.58879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204063.59012: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204063.59038: variable 'omit' from source: magic vars 10202 1727204063.59050: starting attempt loop 10202 1727204063.59057: running the handler 10202 1727204063.59076: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204063.59105: _low_level_execute_command(): starting 10202 1727204063.59150: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204063.60258: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.60284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.60311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.60485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.62327: stdout chunk (state=3): >>>/root <<< 10202 1727204063.62492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.62533: stderr chunk (state=3): >>><<< 10202 1727204063.62549: stdout chunk (state=3): >>><<< 10202 1727204063.62581: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204063.62610: _low_level_execute_command(): starting 10202 1727204063.62703: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358 `" && echo ansible-tmp-1727204063.6258872-11646-37607930587358="` echo /root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358 `" ) && sleep 0' 10202 1727204063.63322: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204063.63386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.63495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.63596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.65776: stdout chunk (state=3): >>>ansible-tmp-1727204063.6258872-11646-37607930587358=/root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358 <<< 10202 1727204063.65997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.66001: stdout chunk (state=3): >>><<< 10202 1727204063.66003: stderr chunk (state=3): >>><<< 10202 1727204063.66025: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204063.6258872-11646-37607930587358=/root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204063.66171: variable 'ansible_module_compression' from source: unknown 10202 1727204063.66174: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204063.66177: variable 'ansible_facts' from source: unknown 10202 1727204063.66270: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/AnsiballZ_command.py 10202 1727204063.66536: Sending initial data 10202 1727204063.66540: Sent initial data (155 bytes) 10202 1727204063.67150: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204063.67173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204063.67286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204063.67290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.67319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204063.67342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.67360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.67458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.69270: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204063.69383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204063.69457: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmplukameoi /root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/AnsiballZ_command.py <<< 10202 1727204063.69461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/AnsiballZ_command.py" <<< 10202 1727204063.69518: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmplukameoi" to remote "/root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/AnsiballZ_command.py" <<< 10202 1727204063.70467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.70540: stderr chunk (state=3): >>><<< 10202 1727204063.70560: stdout chunk (state=3): >>><<< 10202 1727204063.70595: done transferring module to remote 10202 1727204063.70613: _low_level_execute_command(): starting 10202 1727204063.70624: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/ /root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/AnsiballZ_command.py && sleep 0' 10202 1727204063.71389: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.71449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204063.71472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.71523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.71596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.73854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204063.73918: stderr chunk (state=3): >>><<< 10202 1727204063.73926: stdout chunk (state=3): >>><<< 10202 1727204063.73976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204063.73981: _low_level_execute_command(): starting 10202 1727204063.73988: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/AnsiballZ_command.py && sleep 0' 10202 1727204063.74688: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204063.74709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204063.74726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204063.74760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204063.74764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204063.74872: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.74925: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 10202 1727204063.75014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204063.95140: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:23.924282", "end": "2024-09-24 14:54:23.948651", "delta": "0:00:00.024369", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204063.97316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204063.97320: stdout chunk (state=3): >>><<< 10202 1727204063.97323: stderr chunk (state=3): >>><<< 10202 1727204063.97326: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:23.924282", "end": "2024-09-24 14:54:23.948651", "delta": "0:00:00.024369", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
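The JSON payload above is the command module's result as returned over stdout. A minimal sketch of how that payload could be picked apart is below; the variable names (`profile_exists`, `name`, `filename`) are illustrative and not part of Ansible's API. The `rc == 0` check mirrors the `nm_profile_exists.rc == 0` conditional the log evaluates a few tasks later, and the two stdout columns are NAME and FILENAME, as requested with `nmcli -f`.

```python
import json

# Module result as it appears in the log above (SSH debug noise stripped,
# fields abbreviated to the ones used below).
raw = json.dumps({
    "changed": True,
    "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ",
    "rc": 0,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc",
})

result = json.loads(raw)

# The later conditional in the log ("nm_profile_exists.rc == 0") reduces to:
profile_exists = result["rc"] == 0

# stdout carries two whitespace-separated columns: NAME, then FILENAME.
name, filename = result["stdout"].split()
```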
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
10202 1727204063.97332: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204063.97335: _low_level_execute_command(): starting 10202 1727204063.97337: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204063.6258872-11646-37607930587358/ > /dev/null 2>&1 && sleep 0' 10202 1727204063.98391: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204063.98501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204063.98517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204063.98541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204063.98557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204063.98571: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204063.98586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204063.98771: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204063.98822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204063.98924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204064.01182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204064.01187: stdout chunk (state=3): >>><<< 10202 1727204064.01192: stderr chunk (state=3): >>><<< 10202 1727204064.01214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204064.01222: handler run complete 10202 1727204064.01249: Evaluated conditional (False): False 10202 1727204064.01260: attempt loop complete, returning result 10202 1727204064.01263: _execute() done 10202 1727204064.01269: dumping result to json 10202 1727204064.01392: done dumping result, returning 10202 1727204064.01396: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [127b8e07-fff9-0b04-2570-000000000443] 10202 1727204064.01398: sending task result for task 127b8e07-fff9-0b04-2570-000000000443 10202 1727204064.01624: done sending task result for task 127b8e07-fff9-0b04-2570-000000000443 10202 1727204064.01629: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.024369", "end": "2024-09-24 14:54:23.948651", "rc": 0, "start": "2024-09-24 14:54:23.924282" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 10202 1727204064.01735: no more pending results, returning what we have 10202 1727204064.01738: results queue empty 10202 1727204064.01739: checking for any_errors_fatal 10202 1727204064.01744: done checking for any_errors_fatal 10202 1727204064.01745: checking for max_fail_percentage 10202 1727204064.01747: done checking for max_fail_percentage 10202 1727204064.01747: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.01749: done checking to see if all hosts have failed 10202 1727204064.01749: getting the remaining hosts for this loop 10202 1727204064.01751: done getting the remaining hosts for this loop 10202 1727204064.01755: getting the next task for host managed-node3 10202 1727204064.01761: done 
getting next task for host managed-node3 10202 1727204064.01764: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10202 1727204064.01769: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204064.01774: getting variables 10202 1727204064.01775: in VariableManager get_vars() 10202 1727204064.01816: Calling all_inventory to load vars for managed-node3 10202 1727204064.01819: Calling groups_inventory to load vars for managed-node3 10202 1727204064.01822: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.01836: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.01839: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.01842: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.05786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.10351: done with get_vars() 10202 1727204064.10590: done getting variables 10202 1727204064.10663: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.541) 0:00:25.782 ***** 10202 1727204064.10701: entering _queue_task() for managed-node3/set_fact 10202 1727204064.11511: worker is 1 (out of 1 available) 10202 1727204064.11526: exiting _queue_task() for managed-node3/set_fact 10202 1727204064.11542: done queuing things up, now waiting for results queue to drain 10202 1727204064.11544: waiting for pending results... 
10202 1727204064.12079: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10202 1727204064.12433: in run() - task 127b8e07-fff9-0b04-2570-000000000444 10202 1727204064.12447: variable 'ansible_search_path' from source: unknown 10202 1727204064.12450: variable 'ansible_search_path' from source: unknown 10202 1727204064.12492: calling self._execute() 10202 1727204064.12783: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.12789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.12801: variable 'omit' from source: magic vars 10202 1727204064.13757: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.13772: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.14111: variable 'nm_profile_exists' from source: set_fact 10202 1727204064.14179: Evaluated conditional (nm_profile_exists.rc == 0): True 10202 1727204064.14186: variable 'omit' from source: magic vars 10202 1727204064.14371: variable 'omit' from source: magic vars 10202 1727204064.14415: variable 'omit' from source: magic vars 10202 1727204064.14543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204064.14660: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204064.14692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204064.14712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.14724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.14757: variable 'inventory_hostname' from source: host vars for 'managed-node3' 
10202 1727204064.14761: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.14763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.15077: Set connection var ansible_shell_type to sh 10202 1727204064.15086: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204064.15095: Set connection var ansible_connection to ssh 10202 1727204064.15216: Set connection var ansible_shell_executable to /bin/sh 10202 1727204064.15223: Set connection var ansible_pipelining to False 10202 1727204064.15231: Set connection var ansible_timeout to 10 10202 1727204064.15257: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.15260: variable 'ansible_connection' from source: unknown 10202 1727204064.15263: variable 'ansible_module_compression' from source: unknown 10202 1727204064.15267: variable 'ansible_shell_type' from source: unknown 10202 1727204064.15277: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.15280: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.15318: variable 'ansible_pipelining' from source: unknown 10202 1727204064.15321: variable 'ansible_timeout' from source: unknown 10202 1727204064.15324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.15521: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204064.15648: variable 'omit' from source: magic vars 10202 1727204064.15751: starting attempt loop 10202 1727204064.15754: running the handler 10202 1727204064.15756: handler run complete 10202 1727204064.15758: attempt loop complete, returning result 10202 1727204064.15761: _execute() done 
10202 1727204064.15762: dumping result to json 10202 1727204064.15764: done dumping result, returning 10202 1727204064.15769: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-0b04-2570-000000000444] 10202 1727204064.15771: sending task result for task 127b8e07-fff9-0b04-2570-000000000444 10202 1727204064.15850: done sending task result for task 127b8e07-fff9-0b04-2570-000000000444 10202 1727204064.15853: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10202 1727204064.16032: no more pending results, returning what we have 10202 1727204064.16036: results queue empty 10202 1727204064.16038: checking for any_errors_fatal 10202 1727204064.16048: done checking for any_errors_fatal 10202 1727204064.16049: checking for max_fail_percentage 10202 1727204064.16051: done checking for max_fail_percentage 10202 1727204064.16052: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.16053: done checking to see if all hosts have failed 10202 1727204064.16054: getting the remaining hosts for this loop 10202 1727204064.16056: done getting the remaining hosts for this loop 10202 1727204064.16061: getting the next task for host managed-node3 10202 1727204064.16074: done getting next task for host managed-node3 10202 1727204064.16077: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10202 1727204064.16082: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
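The set_fact result above is gated by the two conditionals the log reports evaluating: `ansible_distribution_major_version != '6'` and `nm_profile_exists.rc == 0`. A rough Python stand-in for that control flow is sketched below; the concrete value of `ansible_distribution_major_version` is an assumption (any value other than `"6"` behaves the same), and the dict is only a toy model of Ansible's fact store, not its implementation.

```python
# Hypothetical stand-ins for the variables the log reports resolving.
ansible_distribution_major_version = "40"   # assumption: any non-"6" value
nm_profile_exists = {"rc": 0}               # from the nmcli task result above

# The two conditionals evaluated in the log, in order:
run_task = ansible_distribution_major_version != "6"   # Evaluated ... : True
profile_found = nm_profile_exists["rc"] == 0           # Evaluated ... : True

# When both hold, the task sets the three flags seen in the ok: result.
facts = {}
if run_task and profile_found:
    facts.update(
        lsr_net_profile_exists=True,
        lsr_net_profile_ansible_managed=True,
        lsr_net_profile_fingerprint=True,
    )
```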
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204064.16087: getting variables 10202 1727204064.16088: in VariableManager get_vars() 10202 1727204064.16136: Calling all_inventory to load vars for managed-node3 10202 1727204064.16139: Calling groups_inventory to load vars for managed-node3 10202 1727204064.16141: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.16153: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.16156: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.16159: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.19695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.23440: done with get_vars() 10202 1727204064.23482: done getting variables 10202 1727204064.23562: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204064.23703: variable 'profile' from source: include params 10202 1727204064.23707: variable 'item' from source: include params 10202 1727204064.23783: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.131) 0:00:25.913 ***** 10202 1727204064.23824: entering _queue_task() for managed-node3/command 10202 1727204064.24320: worker is 1 (out of 1 available) 10202 1727204064.24335: exiting _queue_task() for managed-node3/command 10202 1727204064.24347: done queuing things up, now waiting for results queue to drain 10202 1727204064.24349: waiting for pending results... 10202 1727204064.24687: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 10202 1727204064.24742: in run() - task 127b8e07-fff9-0b04-2570-000000000446 10202 1727204064.24764: variable 'ansible_search_path' from source: unknown 10202 1727204064.24775: variable 'ansible_search_path' from source: unknown 10202 1727204064.24833: calling self._execute() 10202 1727204064.24952: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.24968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.24985: variable 'omit' from source: magic vars 10202 1727204064.25442: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.25546: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.25609: variable 'profile_stat' from source: set_fact 10202 1727204064.25633: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204064.25642: when evaluation is False, skipping this task 10202 1727204064.25654: _execute() done 10202 1727204064.25667: dumping result to json 10202 1727204064.25679: done dumping result, returning 10202 1727204064.25691: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [127b8e07-fff9-0b04-2570-000000000446] 10202 1727204064.25702: sending task result for task 127b8e07-fff9-0b04-2570-000000000446 10202 
1727204064.25922: done sending task result for task 127b8e07-fff9-0b04-2570-000000000446 10202 1727204064.25926: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204064.25989: no more pending results, returning what we have 10202 1727204064.25993: results queue empty 10202 1727204064.25994: checking for any_errors_fatal 10202 1727204064.26002: done checking for any_errors_fatal 10202 1727204064.26003: checking for max_fail_percentage 10202 1727204064.26004: done checking for max_fail_percentage 10202 1727204064.26006: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.26007: done checking to see if all hosts have failed 10202 1727204064.26008: getting the remaining hosts for this loop 10202 1727204064.26010: done getting the remaining hosts for this loop 10202 1727204064.26016: getting the next task for host managed-node3 10202 1727204064.26024: done getting next task for host managed-node3 10202 1727204064.26027: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10202 1727204064.26035: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
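This and the following ifcfg-related tasks all skip on the same guard: `profile_stat.stat.exists` is False, so Ansible never executes the command and instead emits the `skipping:` result with `false_condition` and `skip_reason`. A toy sketch of that skip path is below; `run_or_skip` and the dict shapes are illustrative only, not Ansible internals.

```python
# Assumed shape of the earlier stat result driving the skips in this log.
profile_stat = {"stat": {"exists": False}}

def run_or_skip(when: bool) -> dict:
    """Mimic how a False `when:` short-circuits a task into a skip result."""
    if not when:
        return {
            "changed": False,
            "false_condition": "profile_stat.stat.exists",
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}  # the real task would run the command here

outcome = run_or_skip(profile_stat["stat"]["exists"])
```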
(None), always child state? (None), did rescue? False, did start at task? False 10202 1727204064.26045: getting variables 10202 1727204064.26047: in VariableManager get_vars() 10202 1727204064.26098: Calling all_inventory to load vars for managed-node3 10202 1727204064.26102: Calling groups_inventory to load vars for managed-node3 10202 1727204064.26105: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.26120: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.26124: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.26130: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.28358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.30610: done with get_vars() 10202 1727204064.30653: done getting variables 10202 1727204064.30734: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204064.30868: variable 'profile' from source: include params 10202 1727204064.30873: variable 'item' from source: include params 10202 1727204064.30944: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.071) 0:00:25.985 ***** 10202 1727204064.30983: entering _queue_task() for managed-node3/set_fact 10202 1727204064.31421: worker is 1 (out of 1 available) 10202 1727204064.31438: exiting _queue_task() for managed-node3/set_fact 10202 1727204064.31453: done queuing things up, now waiting for results queue 
to drain 10202 1727204064.31454: waiting for pending results... 10202 1727204064.31884: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 10202 1727204064.32014: in run() - task 127b8e07-fff9-0b04-2570-000000000447 10202 1727204064.32018: variable 'ansible_search_path' from source: unknown 10202 1727204064.32022: variable 'ansible_search_path' from source: unknown 10202 1727204064.32025: calling self._execute() 10202 1727204064.32109: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.32132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.32149: variable 'omit' from source: magic vars 10202 1727204064.32584: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.32604: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.32755: variable 'profile_stat' from source: set_fact 10202 1727204064.32784: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204064.32793: when evaluation is False, skipping this task 10202 1727204064.32801: _execute() done 10202 1727204064.32810: dumping result to json 10202 1727204064.32819: done dumping result, returning 10202 1727204064.32871: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [127b8e07-fff9-0b04-2570-000000000447] 10202 1727204064.32874: sending task result for task 127b8e07-fff9-0b04-2570-000000000447 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204064.33042: no more pending results, returning what we have 10202 1727204064.33046: results queue empty 10202 1727204064.33048: checking for any_errors_fatal 10202 1727204064.33056: done checking for any_errors_fatal 10202 1727204064.33057: checking for max_fail_percentage 10202 1727204064.33059: done checking for 
max_fail_percentage 10202 1727204064.33060: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.33061: done checking to see if all hosts have failed 10202 1727204064.33062: getting the remaining hosts for this loop 10202 1727204064.33064: done getting the remaining hosts for this loop 10202 1727204064.33071: getting the next task for host managed-node3 10202 1727204064.33081: done getting next task for host managed-node3 10202 1727204064.33084: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10202 1727204064.33090: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204064.33095: getting variables 10202 1727204064.33097: in VariableManager get_vars() 10202 1727204064.33149: Calling all_inventory to load vars for managed-node3 10202 1727204064.33153: Calling groups_inventory to load vars for managed-node3 10202 1727204064.33155: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.33289: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.33294: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.33372: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.33955: done sending task result for task 127b8e07-fff9-0b04-2570-000000000447 10202 1727204064.33959: WORKER PROCESS EXITING 10202 1727204064.36434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.41334: done with get_vars() 10202 1727204064.41373: done getting variables 10202 1727204064.41449: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204064.41583: variable 'profile' from source: include params 10202 1727204064.41587: variable 'item' from source: include params 10202 1727204064.41660: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.107) 0:00:26.092 ***** 10202 1727204064.41695: entering _queue_task() for managed-node3/command 10202 1727204064.42207: worker is 1 (out of 1 available) 10202 1727204064.42220: exiting _queue_task() for managed-node3/command 10202 
1727204064.42234: done queuing things up, now waiting for results queue to drain 10202 1727204064.42235: waiting for pending results... 10202 1727204064.42456: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 10202 1727204064.42609: in run() - task 127b8e07-fff9-0b04-2570-000000000448 10202 1727204064.42640: variable 'ansible_search_path' from source: unknown 10202 1727204064.42648: variable 'ansible_search_path' from source: unknown 10202 1727204064.42699: calling self._execute() 10202 1727204064.42820: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.42844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.42862: variable 'omit' from source: magic vars 10202 1727204064.43313: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.43339: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.43489: variable 'profile_stat' from source: set_fact 10202 1727204064.43511: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204064.43553: when evaluation is False, skipping this task 10202 1727204064.43557: _execute() done 10202 1727204064.43560: dumping result to json 10202 1727204064.43562: done dumping result, returning 10202 1727204064.43567: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [127b8e07-fff9-0b04-2570-000000000448] 10202 1727204064.43570: sending task result for task 127b8e07-fff9-0b04-2570-000000000448 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204064.43831: no more pending results, returning what we have 10202 1727204064.43836: results queue empty 10202 1727204064.43838: checking for any_errors_fatal 10202 1727204064.43846: done checking for any_errors_fatal 10202 1727204064.43847: checking for 
max_fail_percentage 10202 1727204064.43849: done checking for max_fail_percentage 10202 1727204064.43850: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.43851: done checking to see if all hosts have failed 10202 1727204064.43852: getting the remaining hosts for this loop 10202 1727204064.43854: done getting the remaining hosts for this loop 10202 1727204064.43859: getting the next task for host managed-node3 10202 1727204064.43869: done getting next task for host managed-node3 10202 1727204064.43873: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10202 1727204064.43878: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204064.43884: getting variables 10202 1727204064.43886: in VariableManager get_vars() 10202 1727204064.44052: Calling all_inventory to load vars for managed-node3 10202 1727204064.44056: Calling groups_inventory to load vars for managed-node3 10202 1727204064.44059: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.44068: done sending task result for task 127b8e07-fff9-0b04-2570-000000000448 10202 1727204064.44072: WORKER PROCESS EXITING 10202 1727204064.44087: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.44090: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.44094: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.46031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.50073: done with get_vars() 10202 1727204064.50237: done getting variables 10202 1727204064.50305: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204064.50685: variable 'profile' from source: include params 10202 1727204064.50689: variable 'item' from source: include params 10202 1727204064.50763: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.091) 0:00:26.183 ***** 10202 1727204064.50801: entering _queue_task() for managed-node3/set_fact 10202 1727204064.51416: worker is 1 (out of 1 available) 10202 1727204064.51431: exiting _queue_task() for managed-node3/set_fact 10202 
1727204064.51443: done queuing things up, now waiting for results queue to drain 10202 1727204064.51445: waiting for pending results... 10202 1727204064.51634: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 10202 1727204064.51784: in run() - task 127b8e07-fff9-0b04-2570-000000000449 10202 1727204064.51806: variable 'ansible_search_path' from source: unknown 10202 1727204064.51813: variable 'ansible_search_path' from source: unknown 10202 1727204064.51892: calling self._execute() 10202 1727204064.51982: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.52000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.52018: variable 'omit' from source: magic vars 10202 1727204064.52451: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.52500: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.52620: variable 'profile_stat' from source: set_fact 10202 1727204064.52643: Evaluated conditional (profile_stat.stat.exists): False 10202 1727204064.52656: when evaluation is False, skipping this task 10202 1727204064.52717: _execute() done 10202 1727204064.52720: dumping result to json 10202 1727204064.52723: done dumping result, returning 10202 1727204064.52726: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [127b8e07-fff9-0b04-2570-000000000449] 10202 1727204064.52731: sending task result for task 127b8e07-fff9-0b04-2570-000000000449 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10202 1727204064.53078: no more pending results, returning what we have 10202 1727204064.53082: results queue empty 10202 1727204064.53084: checking for any_errors_fatal 10202 1727204064.53092: done checking for any_errors_fatal 10202 1727204064.53093: checking for 
max_fail_percentage 10202 1727204064.53094: done checking for max_fail_percentage 10202 1727204064.53096: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.53097: done checking to see if all hosts have failed 10202 1727204064.53097: getting the remaining hosts for this loop 10202 1727204064.53100: done getting the remaining hosts for this loop 10202 1727204064.53105: getting the next task for host managed-node3 10202 1727204064.53115: done getting next task for host managed-node3 10202 1727204064.53119: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10202 1727204064.53123: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204064.53131: getting variables 10202 1727204064.53134: in VariableManager get_vars() 10202 1727204064.53314: Calling all_inventory to load vars for managed-node3 10202 1727204064.53318: Calling groups_inventory to load vars for managed-node3 10202 1727204064.53321: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.53337: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.53340: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.53344: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.53984: done sending task result for task 127b8e07-fff9-0b04-2570-000000000449 10202 1727204064.53987: WORKER PROCESS EXITING 10202 1727204064.55752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.59217: done with get_vars() 10202 1727204064.59263: done getting variables 10202 1727204064.59331: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204064.59467: variable 'profile' from source: include params 10202 1727204064.59471: variable 'item' from source: include params 10202 1727204064.59547: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.087) 0:00:26.271 ***** 10202 1727204064.59583: entering _queue_task() for managed-node3/assert 10202 1727204064.60068: worker is 1 (out of 1 available) 10202 1727204064.60081: exiting _queue_task() for managed-node3/assert 10202 
1727204064.60094: done queuing things up, now waiting for results queue to drain 10202 1727204064.60095: waiting for pending results... 10202 1727204064.60343: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.1' 10202 1727204064.60478: in run() - task 127b8e07-fff9-0b04-2570-00000000026e 10202 1727204064.60501: variable 'ansible_search_path' from source: unknown 10202 1727204064.60508: variable 'ansible_search_path' from source: unknown 10202 1727204064.60566: calling self._execute() 10202 1727204064.60688: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.60703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.60725: variable 'omit' from source: magic vars 10202 1727204064.61677: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.61681: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.61683: variable 'omit' from source: magic vars 10202 1727204064.61685: variable 'omit' from source: magic vars 10202 1727204064.61935: variable 'profile' from source: include params 10202 1727204064.61948: variable 'item' from source: include params 10202 1727204064.62119: variable 'item' from source: include params 10202 1727204064.62158: variable 'omit' from source: magic vars 10202 1727204064.62279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204064.62449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204064.62482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204064.62510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.62525: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.62586: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204064.62664: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.62675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.62920: Set connection var ansible_shell_type to sh 10202 1727204064.62946: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204064.62957: Set connection var ansible_connection to ssh 10202 1727204064.62970: Set connection var ansible_shell_executable to /bin/sh 10202 1727204064.62993: Set connection var ansible_pipelining to False 10202 1727204064.63005: Set connection var ansible_timeout to 10 10202 1727204064.63048: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.63057: variable 'ansible_connection' from source: unknown 10202 1727204064.63068: variable 'ansible_module_compression' from source: unknown 10202 1727204064.63080: variable 'ansible_shell_type' from source: unknown 10202 1727204064.63171: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.63175: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.63177: variable 'ansible_pipelining' from source: unknown 10202 1727204064.63180: variable 'ansible_timeout' from source: unknown 10202 1727204064.63182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.63316: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204064.63338: variable 'omit' from source: magic vars 10202 1727204064.63347: starting 
attempt loop 10202 1727204064.63353: running the handler 10202 1727204064.63538: variable 'lsr_net_profile_exists' from source: set_fact 10202 1727204064.63550: Evaluated conditional (lsr_net_profile_exists): True 10202 1727204064.63563: handler run complete 10202 1727204064.63586: attempt loop complete, returning result 10202 1727204064.63593: _execute() done 10202 1727204064.63601: dumping result to json 10202 1727204064.63609: done dumping result, returning 10202 1727204064.63622: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.1' [127b8e07-fff9-0b04-2570-00000000026e] 10202 1727204064.63644: sending task result for task 127b8e07-fff9-0b04-2570-00000000026e 10202 1727204064.63822: done sending task result for task 127b8e07-fff9-0b04-2570-00000000026e 10202 1727204064.63825: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204064.63912: no more pending results, returning what we have 10202 1727204064.63916: results queue empty 10202 1727204064.63917: checking for any_errors_fatal 10202 1727204064.63927: done checking for any_errors_fatal 10202 1727204064.63930: checking for max_fail_percentage 10202 1727204064.63932: done checking for max_fail_percentage 10202 1727204064.63933: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.63935: done checking to see if all hosts have failed 10202 1727204064.63935: getting the remaining hosts for this loop 10202 1727204064.63937: done getting the remaining hosts for this loop 10202 1727204064.63942: getting the next task for host managed-node3 10202 1727204064.63949: done getting next task for host managed-node3 10202 1727204064.63952: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10202 1727204064.63956: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204064.63961: getting variables 10202 1727204064.63963: in VariableManager get_vars() 10202 1727204064.64013: Calling all_inventory to load vars for managed-node3 10202 1727204064.64016: Calling groups_inventory to load vars for managed-node3 10202 1727204064.64018: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.64033: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.64036: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.64038: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.66892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.70272: done with get_vars() 10202 1727204064.70303: done getting variables 10202 1727204064.70384: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204064.70514: variable 'profile' from source: include params 10202 1727204064.70517: variable 'item' from source: include params 10202 1727204064.70592: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.110) 0:00:26.381 ***** 10202 1727204064.70636: entering _queue_task() for managed-node3/assert 10202 1727204064.71049: worker is 1 (out of 1 available) 10202 1727204064.71062: exiting _queue_task() for managed-node3/assert 10202 1727204064.71080: done queuing things up, now waiting for results queue to drain 10202 1727204064.71082: waiting for pending results... 10202 1727204064.71488: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 10202 1727204064.71641: in run() - task 127b8e07-fff9-0b04-2570-00000000026f 10202 1727204064.71645: variable 'ansible_search_path' from source: unknown 10202 1727204064.71649: variable 'ansible_search_path' from source: unknown 10202 1727204064.71652: calling self._execute() 10202 1727204064.71776: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.71791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.71806: variable 'omit' from source: magic vars 10202 1727204064.72242: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.72261: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.72275: variable 'omit' from source: magic vars 10202 1727204064.72324: variable 'omit' from source: magic vars 10202 1727204064.72449: variable 'profile' from source: include params 10202 1727204064.72459: variable 'item' from source: include params 10202 1727204064.72538: variable 'item' from source: include params 10202 1727204064.72618: variable 'omit' from source: magic vars 10202 1727204064.72621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204064.72662: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204064.72689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204064.72711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.72734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.72769: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204064.72777: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.72784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.72907: Set connection var ansible_shell_type to sh 10202 1727204064.72920: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204064.72945: Set connection var ansible_connection to ssh 10202 1727204064.72950: Set connection var ansible_shell_executable to /bin/sh 10202 1727204064.73055: Set connection var ansible_pipelining to False 10202 1727204064.73059: Set connection var ansible_timeout to 10 10202 1727204064.73061: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.73064: variable 'ansible_connection' from source: unknown 10202 1727204064.73069: variable 'ansible_module_compression' from source: unknown 10202 1727204064.73071: variable 'ansible_shell_type' from source: unknown 10202 1727204064.73073: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.73075: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.73077: variable 'ansible_pipelining' from source: unknown 10202 1727204064.73079: variable 'ansible_timeout' from source: unknown 10202 1727204064.73081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 
1727204064.73219: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204064.73240: variable 'omit' from source: magic vars 10202 1727204064.73250: starting attempt loop 10202 1727204064.73258: running the handler 10202 1727204064.73399: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10202 1727204064.73410: Evaluated conditional (lsr_net_profile_ansible_managed): True 10202 1727204064.73422: handler run complete 10202 1727204064.73447: attempt loop complete, returning result 10202 1727204064.73456: _execute() done 10202 1727204064.73468: dumping result to json 10202 1727204064.73479: done dumping result, returning 10202 1727204064.73500: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [127b8e07-fff9-0b04-2570-00000000026f] 10202 1727204064.73601: sending task result for task 127b8e07-fff9-0b04-2570-00000000026f 10202 1727204064.73689: done sending task result for task 127b8e07-fff9-0b04-2570-00000000026f 10202 1727204064.73692: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204064.73750: no more pending results, returning what we have 10202 1727204064.73753: results queue empty 10202 1727204064.73755: checking for any_errors_fatal 10202 1727204064.73763: done checking for any_errors_fatal 10202 1727204064.73764: checking for max_fail_percentage 10202 1727204064.73767: done checking for max_fail_percentage 10202 1727204064.73769: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.73770: done checking to see if all hosts have failed 10202 1727204064.73771: getting the remaining hosts for this loop 10202 1727204064.73773: done 
getting the remaining hosts for this loop 10202 1727204064.73778: getting the next task for host managed-node3 10202 1727204064.73785: done getting next task for host managed-node3 10202 1727204064.73789: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10202 1727204064.73793: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204064.73798: getting variables 10202 1727204064.73800: in VariableManager get_vars() 10202 1727204064.73852: Calling all_inventory to load vars for managed-node3 10202 1727204064.73856: Calling groups_inventory to load vars for managed-node3 10202 1727204064.73858: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.73989: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.73993: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.73997: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.75982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.78176: done with get_vars() 10202 1727204064.78215: done getting variables 10202 1727204064.78294: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 10202 1727204064.78423: variable 'profile' from source: include params 10202 1727204064.78427: variable 'item' from source: include params 10202 1727204064.78499: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.079) 0:00:26.460 ***** 10202 1727204064.78541: entering _queue_task() for managed-node3/assert 10202 1727204064.78948: worker is 1 (out of 1 available) 10202 1727204064.78970: exiting _queue_task() for managed-node3/assert 10202 1727204064.78987: done queuing things up, now waiting for results queue to drain 10202 1727204064.78988: waiting for pending results... 10202 1727204064.79392: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.1 10202 1727204064.79398: in run() - task 127b8e07-fff9-0b04-2570-000000000270 10202 1727204064.79401: variable 'ansible_search_path' from source: unknown 10202 1727204064.79404: variable 'ansible_search_path' from source: unknown 10202 1727204064.79490: calling self._execute() 10202 1727204064.79530: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.79534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.79595: variable 'omit' from source: magic vars 10202 1727204064.79932: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.79943: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.79951: variable 'omit' from source: magic vars 10202 1727204064.79997: variable 'omit' from source: magic vars 10202 1727204064.80109: variable 'profile' from source: include params 10202 1727204064.80113: variable 'item' from source: include params 10202 
1727204064.80248: variable 'item' from source: include params 10202 1727204064.80252: variable 'omit' from source: magic vars 10202 1727204064.80254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204064.80282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204064.80303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204064.80322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.80335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.80369: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204064.80373: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.80376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.80482: Set connection var ansible_shell_type to sh 10202 1727204064.80488: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204064.80495: Set connection var ansible_connection to ssh 10202 1727204064.80575: Set connection var ansible_shell_executable to /bin/sh 10202 1727204064.80578: Set connection var ansible_pipelining to False 10202 1727204064.80581: Set connection var ansible_timeout to 10 10202 1727204064.80583: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.80585: variable 'ansible_connection' from source: unknown 10202 1727204064.80587: variable 'ansible_module_compression' from source: unknown 10202 1727204064.80590: variable 'ansible_shell_type' from source: unknown 10202 1727204064.80592: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.80594: variable 'ansible_host' from source: host 
vars for 'managed-node3' 10202 1727204064.80596: variable 'ansible_pipelining' from source: unknown 10202 1727204064.80599: variable 'ansible_timeout' from source: unknown 10202 1727204064.80601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.80709: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204064.80722: variable 'omit' from source: magic vars 10202 1727204064.80729: starting attempt loop 10202 1727204064.80733: running the handler 10202 1727204064.80850: variable 'lsr_net_profile_fingerprint' from source: set_fact 10202 1727204064.80854: Evaluated conditional (lsr_net_profile_fingerprint): True 10202 1727204064.80861: handler run complete 10202 1727204064.80902: attempt loop complete, returning result 10202 1727204064.80905: _execute() done 10202 1727204064.80907: dumping result to json 10202 1727204064.80910: done dumping result, returning 10202 1727204064.80912: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.1 [127b8e07-fff9-0b04-2570-000000000270] 10202 1727204064.80915: sending task result for task 127b8e07-fff9-0b04-2570-000000000270 10202 1727204064.81079: done sending task result for task 127b8e07-fff9-0b04-2570-000000000270 10202 1727204064.81082: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10202 1727204064.81161: no more pending results, returning what we have 10202 1727204064.81163: results queue empty 10202 1727204064.81164: checking for any_errors_fatal 10202 1727204064.81171: done checking for any_errors_fatal 10202 1727204064.81172: checking for max_fail_percentage 10202 1727204064.81173: done checking for max_fail_percentage 
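Every record in this trace is prefixed by the controller worker PID followed by a Unix epoch timestamp; the wall-clock banners (e.g. "Tuesday 24 September 2024 14:54:24 -0400") are the same instants rendered in local time. A minimal decoding sketch (the helper name `decode_prefix` is made up for illustration, not part of Ansible):

```python
from datetime import datetime, timezone

def decode_prefix(record: str) -> tuple[int, datetime]:
    """Split a '<pid> <epoch>: ...' debug record into PID and UTC timestamp."""
    pid, epoch = record.split()[:2]
    ts = datetime.fromtimestamp(float(epoch.rstrip(":")), tz=timezone.utc)
    return int(pid), ts

# A record copied from the trace above; 14:54:24 -0400 == 18:54:24 UTC
pid, ts = decode_prefix("10202 1727204064.81174: checking for any_errors_fatal")
print(pid, ts.strftime("%Y-%m-%d %H:%M:%S"))  # prints: 10202 2024-09-24 18:54:24
```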
10202 1727204064.81174: checking to see if all hosts have failed and the running result is not ok 10202 1727204064.81175: done checking to see if all hosts have failed 10202 1727204064.81176: getting the remaining hosts for this loop 10202 1727204064.81177: done getting the remaining hosts for this loop 10202 1727204064.81181: getting the next task for host managed-node3 10202 1727204064.81188: done getting next task for host managed-node3 10202 1727204064.81190: ^ task is: TASK: ** TEST check polling interval 10202 1727204064.81192: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204064.81196: getting variables 10202 1727204064.81197: in VariableManager get_vars() 10202 1727204064.81243: Calling all_inventory to load vars for managed-node3 10202 1727204064.81246: Calling groups_inventory to load vars for managed-node3 10202 1727204064.81249: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204064.81259: Calling all_plugins_play to load vars for managed-node3 10202 1727204064.81262: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204064.81265: Calling groups_plugins_play to load vars for managed-node3 10202 1727204064.83149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204064.85412: done with get_vars() 10202 1727204064.85455: done getting variables 10202 1727204064.85532: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check 
polling interval] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.070) 0:00:26.531 ***** 10202 1727204064.85564: entering _queue_task() for managed-node3/command 10202 1727204064.85985: worker is 1 (out of 1 available) 10202 1727204064.85998: exiting _queue_task() for managed-node3/command 10202 1727204064.86013: done queuing things up, now waiting for results queue to drain 10202 1727204064.86014: waiting for pending results... 10202 1727204064.86376: running TaskExecutor() for managed-node3/TASK: ** TEST check polling interval 10202 1727204064.86672: in run() - task 127b8e07-fff9-0b04-2570-000000000071 10202 1727204064.86676: variable 'ansible_search_path' from source: unknown 10202 1727204064.86680: calling self._execute() 10202 1727204064.86856: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.87172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.87175: variable 'omit' from source: magic vars 10202 1727204064.87836: variable 'ansible_distribution_major_version' from source: facts 10202 1727204064.87903: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204064.87918: variable 'omit' from source: magic vars 10202 1727204064.87987: variable 'omit' from source: magic vars 10202 1727204064.88320: variable 'controller_device' from source: play vars 10202 1727204064.88333: variable 'omit' from source: magic vars 10202 1727204064.88645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204064.88649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204064.88652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204064.88654: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.88657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204064.88758: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204064.88769: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.88838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.89089: Set connection var ansible_shell_type to sh 10202 1727204064.89102: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204064.89113: Set connection var ansible_connection to ssh 10202 1727204064.89165: Set connection var ansible_shell_executable to /bin/sh 10202 1727204064.89183: Set connection var ansible_pipelining to False 10202 1727204064.89195: Set connection var ansible_timeout to 10 10202 1727204064.89352: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.89356: variable 'ansible_connection' from source: unknown 10202 1727204064.89359: variable 'ansible_module_compression' from source: unknown 10202 1727204064.89361: variable 'ansible_shell_type' from source: unknown 10202 1727204064.89363: variable 'ansible_shell_executable' from source: unknown 10202 1727204064.89368: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204064.89370: variable 'ansible_pipelining' from source: unknown 10202 1727204064.89373: variable 'ansible_timeout' from source: unknown 10202 1727204064.89375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204064.89660: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204064.89711: variable 'omit' from source: magic vars 10202 1727204064.89714: starting attempt loop 10202 1727204064.89717: running the handler 10202 1727204064.89733: _low_level_execute_command(): starting 10202 1727204064.89748: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204064.91171: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204064.91178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204064.91423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204064.91440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204064.91581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204064.93447: stdout chunk (state=3): >>>/root <<< 10202 1727204064.93617: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204064.93801: stderr chunk (state=3): >>><<< 10202 1727204064.93805: stdout chunk (state=3): >>><<< 10202 1727204064.94078: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204064.94083: _low_level_execute_command(): starting 10202 1727204064.94087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878 `" && echo ansible-tmp-1727204064.9388092-11691-248533124422878="` echo /root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878 `" ) && sleep 0' 10202 1727204064.94634: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204064.94642: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204064.94660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204064.94691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204064.94801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204064.94819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204064.94822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204064.94825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204064.94895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204064.97074: stdout chunk (state=3): >>>ansible-tmp-1727204064.9388092-11691-248533124422878=/root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878 <<< 10202 1727204064.97193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204064.97286: stderr chunk (state=3): >>><<< 10202 1727204064.97289: stdout chunk (state=3): >>><<< 10202 1727204064.97392: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204064.9388092-11691-248533124422878=/root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204064.97396: variable 'ansible_module_compression' from source: unknown 10202 1727204064.97399: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204064.97433: variable 'ansible_facts' from source: unknown 10202 1727204064.97529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/AnsiballZ_command.py 10202 1727204064.97690: Sending initial data 10202 1727204064.97693: Sent initial data (156 bytes) 10202 1727204064.98321: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204064.98472: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204064.98477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204064.98479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204064.98494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204064.98600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.00374: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204065.00430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10202 1727204065.00497: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp3io5idlm /root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/AnsiballZ_command.py <<< 10202 1727204065.00500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/AnsiballZ_command.py" <<< 10202 1727204065.00566: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp3io5idlm" to remote "/root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/AnsiballZ_command.py" <<< 10202 1727204065.01259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.01305: stderr chunk (state=3): >>><<< 10202 1727204065.01309: stdout chunk (state=3): >>><<< 10202 1727204065.01332: done transferring module to remote 10202 1727204065.01340: _low_level_execute_command(): starting 10202 1727204065.01345: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/ /root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/AnsiballZ_command.py && sleep 0' 10202 1727204065.01942: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.01967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.02001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.02075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.04107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.04176: stderr chunk (state=3): >>><<< 10202 1727204065.04181: stdout chunk (state=3): >>><<< 10202 1727204065.04191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204065.04195: _low_level_execute_command(): starting 10202 1727204065.04201: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/AnsiballZ_command.py && sleep 0' 10202 1727204065.04715: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.04721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10202 1727204065.04724: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204065.04727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.04783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.04813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.04911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.22894: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:54:25.223634", "end": "2024-09-24 14:54:25.227556", "delta": "0:00:00.003922", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204065.25035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204065.25039: stdout chunk (state=3): >>><<< 10202 1727204065.25041: stderr chunk (state=3): >>><<< 10202 1727204065.25044: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:54:25.223634", "end": "2024-09-24 14:54:25.227556", "delta": "0:00:00.003922", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
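The module process emits its entire result as one JSON object on stdout; the controller parses it and then evaluates the task's conditional against the parsed fields (here, checking the polling interval in the grep output). A sketch of that controller-side step, using the result captured above (trimmed to the relevant fields):

```python
import json

# Module stdout as seen in the trace, reduced to the fields the conditional uses.
raw = ('{"changed": true, "stdout": "MII Polling Interval (ms): 110", '
       '"stderr": "", "rc": 0, '
       '"cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"]}')

result = json.loads(raw)
assert result["rc"] == 0          # the grep found a match
# The conditional the controller evaluates: '110' in result.stdout
passed = "110" in result["stdout"]
print(passed)  # prints: True
```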
10202 1727204065.25046: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204065.25049: _low_level_execute_command(): starting 10202 1727204065.25051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878/ > /dev/null 2>&1 && sleep 0' 10202 1727204065.25705: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.25803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.25853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.25924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.28007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.28079: stderr chunk (state=3): >>><<< 10202 1727204065.28083: stdout chunk (state=3): >>><<< 10202 1727204065.28092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204065.28102: handler run complete 10202 1727204065.28123: Evaluated conditional (False): False 
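Each remote command task in this trace follows the same low-level cycle: resolve the remote home, create a private temp directory, sftp the AnsiballZ payload into it, chmod it, run it with the remote Python, and finally remove the directory. A hedged reconstruction of those command strings (the helper name is invented, and the quoting is simplified relative to the exact `` ` echo ... ` `` nesting the log shows):

```python
def ansiballz_commands(tmpdir: str,
                       module: str = "AnsiballZ_command.py",
                       python: str = "/usr/bin/python3.12") -> list[str]:
    """Rebuild, in simplified form, the shell commands _low_level_execute_command() issues."""
    payload = f"{tmpdir}/{module}"
    return [
        "/bin/sh -c 'echo ~ && sleep 0'",                                # resolve remote home
        f"/bin/sh -c '( umask 77 && mkdir -p {tmpdir} ) && sleep 0'",    # private temp dir
        # (sftp) put the zipped module source as the payload file
        f"/bin/sh -c 'chmod u+x {tmpdir}/ {payload} && sleep 0'",        # make it executable
        f"/bin/sh -c '{python} {payload} && sleep 0'",                   # run it; JSON on stdout
        f"/bin/sh -c 'rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0'",  # clean up
    ]

cmds = ansiballz_commands(
    "/root/.ansible/tmp/ansible-tmp-1727204064.9388092-11691-248533124422878")
print(len(cmds))  # prints: 5
```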
10202 1727204065.28264: variable 'result' from source: unknown
10202 1727204065.28280: Evaluated conditional ('110' in result.stdout): True
10202 1727204065.28291: attempt loop complete, returning result
10202 1727204065.28294: _execute() done
10202 1727204065.28296: dumping result to json
10202 1727204065.28302: done dumping result, returning
10202 1727204065.28311: done running TaskExecutor() for managed-node3/TASK: ** TEST check polling interval [127b8e07-fff9-0b04-2570-000000000071]
10202 1727204065.28319: sending task result for task 127b8e07-fff9-0b04-2570-000000000071
10202 1727204065.28423: done sending task result for task 127b8e07-fff9-0b04-2570-000000000071
10202 1727204065.28426: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "attempts": 1,
    "changed": false,
    "cmd": [
        "grep",
        "Polling Interval",
        "/proc/net/bonding/nm-bond"
    ],
    "delta": "0:00:00.003922",
    "end": "2024-09-24 14:54:25.227556",
    "rc": 0,
    "start": "2024-09-24 14:54:25.223634"
}

STDOUT:

MII Polling Interval (ms): 110

10202 1727204065.28510: no more pending results, returning what we have
10202 1727204065.28514: results queue empty
10202 1727204065.28515: checking for any_errors_fatal
10202 1727204065.28522: done checking for any_errors_fatal
10202 1727204065.28523: checking for max_fail_percentage
10202 1727204065.28525: done checking for max_fail_percentage
10202 1727204065.28526: checking to see if all hosts have failed and the running result is not ok
10202 1727204065.28527: done checking to see if all hosts have failed
10202 1727204065.28530: getting the remaining hosts for this loop
10202 1727204065.28532: done getting the remaining hosts for this loop
10202 1727204065.28536: getting the next task for host managed-node3
10202 1727204065.28543: done getting next task for host managed-node3
10202 1727204065.28546: ^ task is: TASK: ** TEST check IPv4
10202 1727204065.28548: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204065.28551: getting variables
10202 1727204065.28552: in VariableManager get_vars()
10202 1727204065.28595: Calling all_inventory to load vars for managed-node3
10202 1727204065.28598: Calling groups_inventory to load vars for managed-node3
10202 1727204065.28600: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204065.28611: Calling all_plugins_play to load vars for managed-node3
10202 1727204065.28614: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204065.28617: Calling groups_plugins_play to load vars for managed-node3
10202 1727204065.29630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204065.31661: done with get_vars()
10202 1727204065.31697: done getting variables
10202 1727204065.31768: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [** TEST check IPv4] ******************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80
Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.462) 0:00:26.993 *****
10202 1727204065.31800: entering _queue_task() for managed-node3/command
10202 1727204065.32204: worker is 1 (out of 1 available)
10202 1727204065.32219: exiting _queue_task() for managed-node3/command
10202 1727204065.32236: done queuing things up, now waiting for results queue to drain
10202 1727204065.32238: waiting for pending results...
10202 1727204065.32689: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 10202 1727204065.32694: in run() - task 127b8e07-fff9-0b04-2570-000000000072 10202 1727204065.32698: variable 'ansible_search_path' from source: unknown 10202 1727204065.32740: calling self._execute() 10202 1727204065.32852: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204065.32872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204065.32893: variable 'omit' from source: magic vars 10202 1727204065.33209: variable 'ansible_distribution_major_version' from source: facts 10202 1727204065.33222: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204065.33232: variable 'omit' from source: magic vars 10202 1727204065.33247: variable 'omit' from source: magic vars 10202 1727204065.33320: variable 'controller_device' from source: play vars 10202 1727204065.33338: variable 'omit' from source: magic vars 10202 1727204065.33378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204065.33408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204065.33424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204065.33440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204065.33451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204065.33481: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204065.33485: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204065.33488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204065.33564: Set 
connection var ansible_shell_type to sh 10202 1727204065.33570: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204065.33578: Set connection var ansible_connection to ssh 10202 1727204065.33583: Set connection var ansible_shell_executable to /bin/sh 10202 1727204065.33588: Set connection var ansible_pipelining to False 10202 1727204065.33595: Set connection var ansible_timeout to 10 10202 1727204065.33616: variable 'ansible_shell_executable' from source: unknown 10202 1727204065.33619: variable 'ansible_connection' from source: unknown 10202 1727204065.33622: variable 'ansible_module_compression' from source: unknown 10202 1727204065.33624: variable 'ansible_shell_type' from source: unknown 10202 1727204065.33626: variable 'ansible_shell_executable' from source: unknown 10202 1727204065.33632: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204065.33634: variable 'ansible_pipelining' from source: unknown 10202 1727204065.33636: variable 'ansible_timeout' from source: unknown 10202 1727204065.33640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204065.33755: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204065.33767: variable 'omit' from source: magic vars 10202 1727204065.33772: starting attempt loop 10202 1727204065.33775: running the handler 10202 1727204065.33789: _low_level_execute_command(): starting 10202 1727204065.33798: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204065.34574: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204065.34579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 
1727204065.34582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.34585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204065.34588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204065.34590: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204065.34592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.34600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204065.34609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204065.34615: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204065.34640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204065.34643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.34742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.34746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204065.34748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.34801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.35094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.36817: stdout chunk (state=3): >>>/root <<< 10202 1727204065.36952: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 10202 1727204065.36995: stderr chunk (state=3): >>><<< 10202 1727204065.36997: stdout chunk (state=3): >>><<< 10202 1727204065.37012: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204065.37030: _low_level_execute_command(): starting 10202 1727204065.37073: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390 `" && echo ansible-tmp-1727204065.370173-11711-26125934761390="` echo /root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390 `" ) && sleep 0' 10202 1727204065.37792: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.38044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.40151: stdout chunk (state=3): >>>ansible-tmp-1727204065.370173-11711-26125934761390=/root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390 <<< 10202 1727204065.40352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.40378: stdout chunk (state=3): >>><<< 10202 1727204065.40393: stderr chunk (state=3): >>><<< 10202 1727204065.40420: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204065.370173-11711-26125934761390=/root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204065.40464: variable 'ansible_module_compression' from source: unknown 10202 1727204065.40536: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204065.40588: variable 'ansible_facts' from source: unknown 10202 1727204065.40673: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/AnsiballZ_command.py 10202 1727204065.40937: Sending initial data 10202 1727204065.40941: Sent initial data (154 bytes) 10202 1727204065.41751: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204065.41776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.41814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.41898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.43717: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204065.43798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204065.43870: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpih5ft6g8 /root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/AnsiballZ_command.py <<< 10202 1727204065.43874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/AnsiballZ_command.py" <<< 10202 1727204065.43935: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpih5ft6g8" to remote "/root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/AnsiballZ_command.py" <<< 10202 1727204065.44861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.44901: stderr chunk (state=3): >>><<< 10202 1727204065.44909: stdout chunk (state=3): >>><<< 10202 1727204065.44943: done transferring module to remote 10202 1727204065.45042: _low_level_execute_command(): starting 10202 1727204065.45046: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/ /root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/AnsiballZ_command.py && sleep 0' 10202 1727204065.45667: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204065.45685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204065.45701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.45736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.45835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.45862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.45978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.48185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.48190: stdout chunk (state=3): >>><<< 10202 1727204065.48193: stderr chunk (state=3): >>><<< 10202 1727204065.48275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204065.48279: _low_level_execute_command(): starting 10202 1727204065.48283: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/AnsiballZ_command.py && sleep 0' 10202 1727204065.48942: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204065.48947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204065.48950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.48952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204065.48955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204065.48958: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204065.48960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.48962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204065.48965: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204065.48969: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204065.48971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204065.48991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 
1727204065.49003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204065.49012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204065.49018: stderr chunk (state=3): >>>debug2: match found <<< 10202 1727204065.49089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.49097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204065.49120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.49142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.49245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.67389: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.249/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 231sec preferred_lft 231sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:25.668306", "end": "2024-09-24 14:54:25.672573", "delta": "0:00:00.004267", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204065.69180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204065.69244: stderr chunk (state=3): >>><<< 10202 1727204065.69250: stdout chunk (state=3): >>><<< 10202 1727204065.69265: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.249/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 231sec preferred_lft 231sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:25.668306", "end": "2024-09-24 14:54:25.672573", "delta": "0:00:00.004267", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204065.69299: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204065.69307: _low_level_execute_command(): starting 10202 1727204065.69313: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204065.370173-11711-26125934761390/ > /dev/null 2>&1 && sleep 0' 10202 1727204065.70013: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.70018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204065.70058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.70061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.70174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.72271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.72287: stderr chunk (state=3): >>><<< 10202 1727204065.72291: stdout chunk (state=3): >>><<< 10202 1727204065.72308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0
10202 1727204065.72315: handler run complete
10202 1727204065.72335: Evaluated conditional (False): False
10202 1727204065.72457: variable 'result' from source: set_fact
10202 1727204065.72475: Evaluated conditional ('192.0.2' in result.stdout): True
10202 1727204065.72485: attempt loop complete, returning result
10202 1727204065.72493: _execute() done
10202 1727204065.72496: dumping result to json
10202 1727204065.72502: done dumping result, returning
10202 1727204065.72511: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 [127b8e07-fff9-0b04-2570-000000000072]
10202 1727204065.72517: sending task result for task 127b8e07-fff9-0b04-2570-000000000072
10202 1727204065.72624: done sending task result for task 127b8e07-fff9-0b04-2570-000000000072
10202 1727204065.72627: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "attempts": 1,
    "changed": false,
    "cmd": [
        "ip",
        "-4",
        "a",
        "s",
        "nm-bond"
    ],
    "delta": "0:00:00.004267",
    "end": "2024-09-24 14:54:25.672573",
    "rc": 0,
    "start": "2024-09-24 14:54:25.668306"
}

STDOUT:

18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000
    inet 192.0.2.249/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond
       valid_lft 231sec preferred_lft 231sec

10202 1727204065.72709: no more pending results, returning what we have
10202 1727204065.72712: results queue empty
10202 1727204065.72714: checking for any_errors_fatal
10202 1727204065.72723: done checking for any_errors_fatal
10202 1727204065.72724: checking for max_fail_percentage
10202 1727204065.72725: done checking for max_fail_percentage
10202 1727204065.72726: checking to see if all hosts have failed and the running result is not ok
10202 1727204065.72727: done checking to see if all hosts have failed
10202 1727204065.72728: getting the remaining hosts for this loop
10202 1727204065.72730: done getting the remaining hosts for this loop
10202 1727204065.72734: getting the next task for host managed-node3
10202 1727204065.72741: done getting next task for host managed-node3
10202 1727204065.72744: ^ task is: TASK: ** TEST check IPv6
10202 1727204065.72746: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10202 1727204065.72749: getting variables
10202 1727204065.72750: in VariableManager get_vars()
10202 1727204065.72794: Calling all_inventory to load vars for managed-node3
10202 1727204065.72797: Calling groups_inventory to load vars for managed-node3
10202 1727204065.72799: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204065.72811: Calling all_plugins_play to load vars for managed-node3
10202 1727204065.72814: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204065.72816: Calling groups_plugins_play to load vars for managed-node3
10202 1727204065.74273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204065.76824: done with get_vars()
10202 1727204065.76867: done getting variables
10202 1727204065.76935: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [** TEST check IPv6] ******************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87
Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.451) 0:00:27.445 *****
10202 1727204065.76970: entering _queue_task() for managed-node3/command
10202 1727204065.77350: worker is 1 (out of 1 available)
10202 1727204065.77364: exiting _queue_task()
for managed-node3/command 10202 1727204065.77379: done queuing things up, now waiting for results queue to drain 10202 1727204065.77380: waiting for pending results... 10202 1727204065.77696: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 10202 1727204065.77792: in run() - task 127b8e07-fff9-0b04-2570-000000000073 10202 1727204065.77797: variable 'ansible_search_path' from source: unknown 10202 1727204065.77820: calling self._execute() 10202 1727204065.77937: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204065.77950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204065.77970: variable 'omit' from source: magic vars 10202 1727204065.78384: variable 'ansible_distribution_major_version' from source: facts 10202 1727204065.78442: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204065.78445: variable 'omit' from source: magic vars 10202 1727204065.78448: variable 'omit' from source: magic vars 10202 1727204065.78555: variable 'controller_device' from source: play vars 10202 1727204065.78582: variable 'omit' from source: magic vars 10202 1727204065.78631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204065.78681: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204065.78705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204065.78768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204065.78771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204065.78784: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204065.78792: variable 'ansible_host' 
from source: host vars for 'managed-node3' 10202 1727204065.78799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204065.78910: Set connection var ansible_shell_type to sh 10202 1727204065.78922: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204065.78931: Set connection var ansible_connection to ssh 10202 1727204065.78941: Set connection var ansible_shell_executable to /bin/sh 10202 1727204065.78969: Set connection var ansible_pipelining to False 10202 1727204065.78972: Set connection var ansible_timeout to 10 10202 1727204065.78996: variable 'ansible_shell_executable' from source: unknown 10202 1727204065.79003: variable 'ansible_connection' from source: unknown 10202 1727204065.79010: variable 'ansible_module_compression' from source: unknown 10202 1727204065.79170: variable 'ansible_shell_type' from source: unknown 10202 1727204065.79174: variable 'ansible_shell_executable' from source: unknown 10202 1727204065.79176: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204065.79178: variable 'ansible_pipelining' from source: unknown 10202 1727204065.79180: variable 'ansible_timeout' from source: unknown 10202 1727204065.79182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204065.79203: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204065.79223: variable 'omit' from source: magic vars 10202 1727204065.79233: starting attempt loop 10202 1727204065.79241: running the handler 10202 1727204065.79261: _low_level_execute_command(): starting 10202 1727204065.79277: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204065.80049: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204065.80075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204065.80186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.80270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.80386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.82261: stdout chunk (state=3): >>>/root <<< 10202 1727204065.82470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.82475: stdout chunk (state=3): >>><<< 10202 1727204065.82477: stderr chunk (state=3): >>><<< 10202 1727204065.82497: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204065.82515: _low_level_execute_command(): starting 10202 1727204065.82606: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680 `" && echo ansible-tmp-1727204065.825035-11733-222982675446680="` echo /root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680 `" ) && sleep 0' 10202 1727204065.83212: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204065.83227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204065.83245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.83268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204065.83331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.83401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204065.83429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.83454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.83567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.85931: stdout chunk (state=3): >>>ansible-tmp-1727204065.825035-11733-222982675446680=/root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680 <<< 10202 1727204065.86173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.86177: stdout chunk (state=3): >>><<< 10202 1727204065.86179: stderr chunk (state=3): >>><<< 10202 1727204065.86182: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204065.825035-11733-222982675446680=/root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204065.86216: variable 'ansible_module_compression' from source: unknown 10202 1727204065.86350: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204065.86354: variable 'ansible_facts' from source: unknown 10202 1727204065.86393: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/AnsiballZ_command.py 10202 1727204065.86648: Sending initial data 10202 1727204065.86651: Sent initial data (155 bytes) 10202 1727204065.87217: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204065.87226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204065.87240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.87280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204065.87287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.87413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.87416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.87492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.89371: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204065.89477: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204065.89559: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp7vrqgo9_ /root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/AnsiballZ_command.py <<< 10202 1727204065.89563: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/AnsiballZ_command.py" <<< 10202 1727204065.89644: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp7vrqgo9_" to remote "/root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/AnsiballZ_command.py" <<< 10202 1727204065.90776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.90790: stderr chunk (state=3): >>><<< 10202 1727204065.90798: stdout chunk (state=3): >>><<< 10202 1727204065.90974: done transferring module to remote 10202 1727204065.90977: _low_level_execute_command(): starting 10202 1727204065.90981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/ /root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/AnsiballZ_command.py && sleep 0' 10202 1727204065.91595: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204065.91612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204065.91627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204065.91649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204065.91668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 
1727204065.91780: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204065.91805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.91927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204065.94575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204065.94579: stdout chunk (state=3): >>><<< 10202 1727204065.94582: stderr chunk (state=3): >>><<< 10202 1727204065.94584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204065.94587: _low_level_execute_command(): starting 10202 1727204065.94590: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/AnsiballZ_command.py && sleep 0' 10202 1727204065.95589: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204065.95675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204065.95706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204065.95718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 
1727204065.95738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204065.95909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204066.14162: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::169/128 scope global dynamic noprefixroute \n valid_lft 233sec preferred_lft 233sec\n inet6 2001:db8::306a:47ff:fec0:41e6/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::306a:47ff:fec0:41e6/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:26.136154", "end": "2024-09-24 14:54:26.140402", "delta": "0:00:00.004248", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204066.16173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204066.16178: stdout chunk (state=3): >>><<< 10202 1727204066.16180: stderr chunk (state=3): >>><<< 10202 1727204066.16201: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::169/128 scope global dynamic noprefixroute \n valid_lft 233sec preferred_lft 233sec\n inet6 2001:db8::306a:47ff:fec0:41e6/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::306a:47ff:fec0:41e6/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:26.136154", "end": "2024-09-24 14:54:26.140402", "delta": "0:00:00.004248", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204066.16246: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204066.16255: _low_level_execute_command(): starting 10202 1727204066.16260: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204065.825035-11733-222982675446680/ > /dev/null 2>&1 && sleep 0' 10202 1727204066.17580: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204066.17584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204066.17723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204066.17784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204066.17885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204066.17957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204066.18032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204066.20208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204066.20212: stdout chunk (state=3): >>><<< 10202 1727204066.20215: stderr chunk (state=3): >>><<< 10202 1727204066.20355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204066.20360: handler run complete
10202 1727204066.20363: Evaluated conditional (False): False
10202 1727204066.20619: variable 'result' from source: set_fact
10202 1727204066.20646: Evaluated conditional ('2001' in result.stdout): True
10202 1727204066.20684: attempt loop complete, returning result
10202 1727204066.20688: _execute() done
10202 1727204066.20691: dumping result to json
10202 1727204066.20697: done dumping result, returning
10202 1727204066.20707: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 [127b8e07-fff9-0b04-2570-000000000073]
10202 1727204066.20713: sending task result for task 127b8e07-fff9-0b04-2570-000000000073
10202 1727204066.21092: done sending task result for task 127b8e07-fff9-0b04-2570-000000000073
10202 1727204066.21096: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "attempts": 1,
    "changed": false,
    "cmd": [
        "ip",
        "-6",
        "a",
        "s",
        "nm-bond"
    ],
    "delta": "0:00:00.004248",
    "end": "2024-09-24 14:54:26.140402",
    "rc": 0,
    "start": "2024-09-24 14:54:26.136154"
}

STDOUT:

18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000
inet6 2001:db8::169/128 scope global dynamic noprefixroute
valid_lft 233sec preferred_lft 233sec
inet6 2001:db8::306a:47ff:fec0:41e6/64 scope global dynamic noprefixroute
valid_lft 1796sec preferred_lft 1796sec
inet6 fe80::306a:47ff:fec0:41e6/64 scope link noprefixroute
valid_lft forever preferred_lft forever

10202 1727204066.21196: no more pending results, returning what we have
10202 1727204066.21200: results queue empty
10202 1727204066.21201: checking for any_errors_fatal
10202 1727204066.21212: done checking for any_errors_fatal
10202 1727204066.21212:
checking for max_fail_percentage 10202 1727204066.21214: done checking for max_fail_percentage 10202 1727204066.21215: checking to see if all hosts have failed and the running result is not ok 10202 1727204066.21216: done checking to see if all hosts have failed 10202 1727204066.21217: getting the remaining hosts for this loop 10202 1727204066.21219: done getting the remaining hosts for this loop 10202 1727204066.21224: getting the next task for host managed-node3 10202 1727204066.21236: done getting next task for host managed-node3 10202 1727204066.21243: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10202 1727204066.21248: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
10202 1727204066.21390: getting variables
10202 1727204066.21393: in VariableManager get_vars()
10202 1727204066.21439: Calling all_inventory to load vars for managed-node3
10202 1727204066.21442: Calling groups_inventory to load vars for managed-node3
10202 1727204066.21444: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204066.21457: Calling all_plugins_play to load vars for managed-node3
10202 1727204066.21460: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204066.21464: Calling groups_plugins_play to load vars for managed-node3
10202 1727204066.28728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204066.30862: done with get_vars()
10202 1727204066.30906: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.540) 0:00:27.985 *****

10202 1727204066.31012: entering _queue_task() for managed-node3/include_tasks
10202 1727204066.31592: worker is 1 (out of 1 available)
10202 1727204066.31604: exiting _queue_task() for managed-node3/include_tasks
10202 1727204066.31616: done queuing things up, now waiting for results queue to drain
10202 1727204066.31617: waiting for pending results...
10202 1727204066.31760: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10202 1727204066.31950: in run() - task 127b8e07-fff9-0b04-2570-00000000007c 10202 1727204066.31979: variable 'ansible_search_path' from source: unknown 10202 1727204066.31988: variable 'ansible_search_path' from source: unknown 10202 1727204066.32037: calling self._execute() 10202 1727204066.32148: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204066.32163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204066.32184: variable 'omit' from source: magic vars 10202 1727204066.32616: variable 'ansible_distribution_major_version' from source: facts 10202 1727204066.32636: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204066.32718: _execute() done 10202 1727204066.32722: dumping result to json 10202 1727204066.32724: done dumping result, returning 10202 1727204066.32727: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-0b04-2570-00000000007c] 10202 1727204066.32729: sending task result for task 127b8e07-fff9-0b04-2570-00000000007c 10202 1727204066.32980: done sending task result for task 127b8e07-fff9-0b04-2570-00000000007c 10202 1727204066.32984: WORKER PROCESS EXITING 10202 1727204066.33033: no more pending results, returning what we have 10202 1727204066.33038: in VariableManager get_vars() 10202 1727204066.33095: Calling all_inventory to load vars for managed-node3 10202 1727204066.33098: Calling groups_inventory to load vars for managed-node3 10202 1727204066.33101: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204066.33116: Calling all_plugins_play to load vars for managed-node3 10202 1727204066.33119: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204066.33122: Calling 
groups_plugins_play to load vars for managed-node3 10202 1727204066.34947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204066.37104: done with get_vars() 10202 1727204066.37141: variable 'ansible_search_path' from source: unknown 10202 1727204066.37142: variable 'ansible_search_path' from source: unknown 10202 1727204066.37193: we have included files to process 10202 1727204066.37194: generating all_blocks data 10202 1727204066.37197: done generating all_blocks data 10202 1727204066.37203: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10202 1727204066.37204: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10202 1727204066.37206: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10202 1727204066.37834: done processing included file 10202 1727204066.37837: iterating over new_blocks loaded from include file 10202 1727204066.37838: in VariableManager get_vars() 10202 1727204066.37871: done with get_vars() 10202 1727204066.37874: filtering new block on tags 10202 1727204066.37908: done filtering new block on tags 10202 1727204066.37911: in VariableManager get_vars() 10202 1727204066.37939: done with get_vars() 10202 1727204066.37941: filtering new block on tags 10202 1727204066.37989: done filtering new block on tags 10202 1727204066.37992: in VariableManager get_vars() 10202 1727204066.38019: done with get_vars() 10202 1727204066.38021: filtering new block on tags 10202 1727204066.38061: done filtering new block on tags 10202 1727204066.38063: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 10202 1727204066.38072: extending task lists for 
all hosts with included blocks 10202 1727204066.39313: done extending task lists 10202 1727204066.39315: done processing included files 10202 1727204066.39316: results queue empty 10202 1727204066.39317: checking for any_errors_fatal 10202 1727204066.39322: done checking for any_errors_fatal 10202 1727204066.39323: checking for max_fail_percentage 10202 1727204066.39324: done checking for max_fail_percentage 10202 1727204066.39325: checking to see if all hosts have failed and the running result is not ok 10202 1727204066.39326: done checking to see if all hosts have failed 10202 1727204066.39327: getting the remaining hosts for this loop 10202 1727204066.39328: done getting the remaining hosts for this loop 10202 1727204066.39331: getting the next task for host managed-node3 10202 1727204066.39336: done getting next task for host managed-node3 10202 1727204066.39339: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10202 1727204066.39342: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10202 1727204066.39354: getting variables 10202 1727204066.39355: in VariableManager get_vars() 10202 1727204066.39378: Calling all_inventory to load vars for managed-node3 10202 1727204066.39381: Calling groups_inventory to load vars for managed-node3 10202 1727204066.39384: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204066.39390: Calling all_plugins_play to load vars for managed-node3 10202 1727204066.39393: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204066.39397: Calling groups_plugins_play to load vars for managed-node3 10202 1727204066.40837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204066.42973: done with get_vars() 10202 1727204066.43009: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.120) 0:00:28.106 ***** 10202 1727204066.43107: entering _queue_task() for managed-node3/setup 10202 1727204066.43532: worker is 1 (out of 1 available) 10202 1727204066.43548: exiting _queue_task() for managed-node3/setup 10202 1727204066.43562: done queuing things up, now waiting for results queue to drain 10202 1727204066.43564: waiting for pending results... 
10202 1727204066.43888: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10202 1727204066.44088: in run() - task 127b8e07-fff9-0b04-2570-000000000491 10202 1727204066.44116: variable 'ansible_search_path' from source: unknown 10202 1727204066.44171: variable 'ansible_search_path' from source: unknown 10202 1727204066.44175: calling self._execute() 10202 1727204066.44274: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204066.44286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204066.44300: variable 'omit' from source: magic vars 10202 1727204066.44740: variable 'ansible_distribution_major_version' from source: facts 10202 1727204066.44759: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204066.45384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204066.48360: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204066.48776: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204066.48826: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204066.48877: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204066.48911: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204066.49012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204066.49053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204066.49091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204066.49141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204066.49161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204066.49230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204066.49255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204066.49300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204066.49357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204066.49404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204066.49645: variable '__network_required_facts' from source: role 
'' defaults 10202 1727204066.49660: variable 'ansible_facts' from source: unknown 10202 1727204066.50606: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10202 1727204066.50621: when evaluation is False, skipping this task 10202 1727204066.50632: _execute() done 10202 1727204066.50640: dumping result to json 10202 1727204066.50648: done dumping result, returning 10202 1727204066.50659: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-0b04-2570-000000000491] 10202 1727204066.50671: sending task result for task 127b8e07-fff9-0b04-2570-000000000491 10202 1727204066.50983: done sending task result for task 127b8e07-fff9-0b04-2570-000000000491 10202 1727204066.50987: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204066.51045: no more pending results, returning what we have 10202 1727204066.51049: results queue empty 10202 1727204066.51050: checking for any_errors_fatal 10202 1727204066.51052: done checking for any_errors_fatal 10202 1727204066.51053: checking for max_fail_percentage 10202 1727204066.51055: done checking for max_fail_percentage 10202 1727204066.51056: checking to see if all hosts have failed and the running result is not ok 10202 1727204066.51057: done checking to see if all hosts have failed 10202 1727204066.51058: getting the remaining hosts for this loop 10202 1727204066.51060: done getting the remaining hosts for this loop 10202 1727204066.51065: getting the next task for host managed-node3 10202 1727204066.51077: done getting next task for host managed-node3 10202 1727204066.51082: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10202 1727204066.51088: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, 
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204066.51108: getting variables 10202 1727204066.51110: in VariableManager get_vars() 10202 1727204066.51159: Calling all_inventory to load vars for managed-node3 10202 1727204066.51163: Calling groups_inventory to load vars for managed-node3 10202 1727204066.51269: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204066.51283: Calling all_plugins_play to load vars for managed-node3 10202 1727204066.51286: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204066.51289: Calling groups_plugins_play to load vars for managed-node3 10202 1727204066.53358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204066.56393: done with get_vars() 10202 1727204066.56438: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.134) 0:00:28.240 ***** 10202 1727204066.56560: entering _queue_task() for managed-node3/stat 10202 1727204066.57458: worker is 1 (out of 1 available) 10202 1727204066.57578: exiting _queue_task() for managed-node3/stat 10202 1727204066.57592: done queuing things up, now waiting for results queue to drain 10202 1727204066.57594: waiting for pending results... 
10202 1727204066.57990: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 10202 1727204066.58617: in run() - task 127b8e07-fff9-0b04-2570-000000000493 10202 1727204066.58656: variable 'ansible_search_path' from source: unknown 10202 1727204066.58660: variable 'ansible_search_path' from source: unknown 10202 1727204066.58763: calling self._execute() 10202 1727204066.58786: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204066.58794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204066.58804: variable 'omit' from source: magic vars 10202 1727204066.59718: variable 'ansible_distribution_major_version' from source: facts 10202 1727204066.59733: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204066.59994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204066.60380: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204066.60384: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204066.60387: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204066.60389: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204066.60549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204066.60579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204066.60605: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204066.60633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204066.60734: variable '__network_is_ostree' from source: set_fact 10202 1727204066.60871: Evaluated conditional (not __network_is_ostree is defined): False 10202 1727204066.60874: when evaluation is False, skipping this task 10202 1727204066.60876: _execute() done 10202 1727204066.60878: dumping result to json 10202 1727204066.60880: done dumping result, returning 10202 1727204066.60883: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-0b04-2570-000000000493] 10202 1727204066.60885: sending task result for task 127b8e07-fff9-0b04-2570-000000000493 10202 1727204066.60958: done sending task result for task 127b8e07-fff9-0b04-2570-000000000493 10202 1727204066.60961: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10202 1727204066.61014: no more pending results, returning what we have 10202 1727204066.61017: results queue empty 10202 1727204066.61018: checking for any_errors_fatal 10202 1727204066.61024: done checking for any_errors_fatal 10202 1727204066.61025: checking for max_fail_percentage 10202 1727204066.61026: done checking for max_fail_percentage 10202 1727204066.61029: checking to see if all hosts have failed and the running result is not ok 10202 1727204066.61031: done checking to see if all hosts have failed 10202 1727204066.61032: getting the remaining hosts for this loop 10202 1727204066.61033: done getting the remaining hosts for this loop 10202 
1727204066.61036: getting the next task for host managed-node3 10202 1727204066.61043: done getting next task for host managed-node3 10202 1727204066.61046: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10202 1727204066.61051: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204066.61071: getting variables 10202 1727204066.61072: in VariableManager get_vars() 10202 1727204066.61111: Calling all_inventory to load vars for managed-node3 10202 1727204066.61114: Calling groups_inventory to load vars for managed-node3 10202 1727204066.61116: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204066.61125: Calling all_plugins_play to load vars for managed-node3 10202 1727204066.61131: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204066.61134: Calling groups_plugins_play to load vars for managed-node3 10202 1727204066.63185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204066.67368: done with get_vars() 10202 1727204066.67406: done getting variables 10202 1727204066.67541: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.110) 0:00:28.351 ***** 10202 1727204066.67590: entering _queue_task() for managed-node3/set_fact 10202 1727204066.68312: worker is 1 (out of 1 available) 10202 1727204066.68326: exiting _queue_task() for managed-node3/set_fact 10202 1727204066.68341: done queuing things up, now waiting for results queue to drain 10202 1727204066.68343: waiting for pending results... 
10202 1727204066.68743: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10202 1727204066.68907: in run() - task 127b8e07-fff9-0b04-2570-000000000494 10202 1727204066.68912: variable 'ansible_search_path' from source: unknown 10202 1727204066.68917: variable 'ansible_search_path' from source: unknown 10202 1727204066.68952: calling self._execute() 10202 1727204066.69062: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204066.69070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204066.69124: variable 'omit' from source: magic vars 10202 1727204066.69550: variable 'ansible_distribution_major_version' from source: facts 10202 1727204066.69558: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204066.69852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204066.70211: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204066.70230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204066.70271: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204066.70317: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204066.70470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204066.70504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204066.70570: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204066.70578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204066.70691: variable '__network_is_ostree' from source: set_fact 10202 1727204066.70705: Evaluated conditional (not __network_is_ostree is defined): False 10202 1727204066.70714: when evaluation is False, skipping this task 10202 1727204066.70721: _execute() done 10202 1727204066.70730: dumping result to json 10202 1727204066.70754: done dumping result, returning 10202 1727204066.70758: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-0b04-2570-000000000494] 10202 1727204066.70770: sending task result for task 127b8e07-fff9-0b04-2570-000000000494 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10202 1727204066.70987: no more pending results, returning what we have 10202 1727204066.70992: results queue empty 10202 1727204066.70993: checking for any_errors_fatal 10202 1727204066.70999: done checking for any_errors_fatal 10202 1727204066.71000: checking for max_fail_percentage 10202 1727204066.71002: done checking for max_fail_percentage 10202 1727204066.71003: checking to see if all hosts have failed and the running result is not ok 10202 1727204066.71004: done checking to see if all hosts have failed 10202 1727204066.71005: getting the remaining hosts for this loop 10202 1727204066.71007: done getting the remaining hosts for this loop 10202 1727204066.71011: getting the next task for host managed-node3 10202 1727204066.71021: done getting next task for host managed-node3 10202 
1727204066.71025: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10202 1727204066.71033: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204066.71052: getting variables 10202 1727204066.71054: in VariableManager get_vars() 10202 1727204066.71101: Calling all_inventory to load vars for managed-node3 10202 1727204066.71104: Calling groups_inventory to load vars for managed-node3 10202 1727204066.71106: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204066.71117: Calling all_plugins_play to load vars for managed-node3 10202 1727204066.71120: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204066.71124: Calling groups_plugins_play to load vars for managed-node3 10202 1727204066.71668: done sending task result for task 127b8e07-fff9-0b04-2570-000000000494 10202 1727204066.71672: WORKER PROCESS EXITING 10202 1727204066.72712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204066.73896: done with get_vars() 10202 1727204066.73922: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.064) 0:00:28.415 ***** 10202 1727204066.74009: entering _queue_task() for managed-node3/service_facts 10202 1727204066.74295: worker is 1 (out of 1 available) 10202 1727204066.74310: exiting _queue_task() for managed-node3/service_facts 10202 1727204066.74322: done queuing things up, now waiting for results queue to drain 10202 1727204066.74324: waiting for pending results... 
10202 1727204066.74530: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running
10202 1727204066.74665: in run() - task 127b8e07-fff9-0b04-2570-000000000496
10202 1727204066.74697: variable 'ansible_search_path' from source: unknown
10202 1727204066.74701: variable 'ansible_search_path' from source: unknown
10202 1727204066.74732: calling self._execute()
10202 1727204066.74903: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204066.74907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204066.74910: variable 'omit' from source: magic vars
10202 1727204066.75466: variable 'ansible_distribution_major_version' from source: facts
10202 1727204066.75475: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204066.75479: variable 'omit' from source: magic vars
10202 1727204066.75671: variable 'omit' from source: magic vars
10202 1727204066.75675: variable 'omit' from source: magic vars
10202 1727204066.75678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
10202 1727204066.75681: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
10202 1727204066.75683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
10202 1727204066.75711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204066.75729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10202 1727204066.75767: variable 'inventory_hostname' from source: host vars for 'managed-node3'
10202 1727204066.75777: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204066.75785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204066.75915: Set connection var ansible_shell_type to sh
10202 1727204066.75929: Set connection var ansible_module_compression to ZIP_DEFLATED
10202 1727204066.75940: Set connection var ansible_connection to ssh
10202 1727204066.75950: Set connection var ansible_shell_executable to /bin/sh
10202 1727204066.75960: Set connection var ansible_pipelining to False
10202 1727204066.75973: Set connection var ansible_timeout to 10
10202 1727204066.76007: variable 'ansible_shell_executable' from source: unknown
10202 1727204066.76027: variable 'ansible_connection' from source: unknown
10202 1727204066.76035: variable 'ansible_module_compression' from source: unknown
10202 1727204066.76125: variable 'ansible_shell_type' from source: unknown
10202 1727204066.76128: variable 'ansible_shell_executable' from source: unknown
10202 1727204066.76131: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204066.76134: variable 'ansible_pipelining' from source: unknown
10202 1727204066.76136: variable 'ansible_timeout' from source: unknown
10202 1727204066.76138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204066.76365: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
10202 1727204066.76372: variable 'omit' from source: magic vars
10202 1727204066.76374: starting attempt loop
10202 1727204066.76377: running the handler
10202 1727204066.76379: _low_level_execute_command(): starting
10202 1727204066.76382: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
10202 1727204066.76904: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
10202 1727204066.76909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
10202 1727204066.76912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<<
10202 1727204066.76917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204066.76970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
10202 1727204066.76977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204066.76979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204066.77050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204066.78917: stdout chunk (state=3): >>>/root <<<
10202 1727204066.79132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204066.79136: stdout chunk (state=3): >>><<<
10202 1727204066.79138: stderr chunk (state=3): >>><<<
10202 1727204066.79159: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204066.79184: _low_level_execute_command(): starting
10202 1727204066.79272: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718 `" && echo ansible-tmp-1727204066.7916849-11774-81761116789718="` echo /root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718 `" ) && sleep 0'
10202 1727204066.79856: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
10202 1727204066.79880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204066.79901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204066.79951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
10202 1727204066.79970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204066.79973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204066.80037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204066.82233: stdout chunk (state=3): >>>ansible-tmp-1727204066.7916849-11774-81761116789718=/root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718 <<<
10202 1727204066.82340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204066.82402: stderr chunk (state=3): >>><<<
10202 1727204066.82406: stdout chunk (state=3): >>><<<
10202 1727204066.82423: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204066.7916849-11774-81761116789718=/root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204066.82473: variable 'ansible_module_compression' from source: unknown
10202 1727204066.82512: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED
10202 1727204066.82550: variable 'ansible_facts' from source: unknown
10202 1727204066.82611: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/AnsiballZ_service_facts.py
10202 1727204066.82730: Sending initial data
10202 1727204066.82734: Sent initial data (161 bytes)
10202 1727204066.83235: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
10202 1727204066.83239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204066.83243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<<
10202 1727204066.83245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<<
10202 1727204066.83248: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204066.83299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
10202 1727204066.83306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204066.83308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204066.83381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204066.85157: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
10202 1727204066.85223: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
10202 1727204066.85289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp7lg2zf5a /root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/AnsiballZ_service_facts.py <<<
10202 1727204066.85298: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/AnsiballZ_service_facts.py" <<<
10202 1727204066.85357: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp7lg2zf5a" to remote "/root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/AnsiballZ_service_facts.py" <<<
10202 1727204066.85360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/AnsiballZ_service_facts.py" <<<
10202 1727204066.86029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204066.86106: stderr chunk (state=3): >>><<<
10202 1727204066.86110: stdout chunk (state=3): >>><<<
10202 1727204066.86130: done transferring module to remote
10202 1727204066.86140: _low_level_execute_command(): starting
10202 1727204066.86144: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/ /root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/AnsiballZ_service_facts.py && sleep 0'
10202 1727204066.86650: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
10202 1727204066.86654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204066.86657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
10202 1727204066.86663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
10202 1727204066.86669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204066.86715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
10202 1727204066.86718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
10202 1727204066.86723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
10202 1727204066.86795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
10202 1727204066.88827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
10202 1727204066.88886: stderr chunk (state=3): >>><<<
10202 1727204066.88890: stdout chunk (state=3): >>><<<
10202 1727204066.88903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10202 1727204066.88907: _low_level_execute_command(): starting
10202 1727204066.88914: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/AnsiballZ_service_facts.py && sleep 0'
10202 1727204066.89425: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
10202 1727204066.89432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
10202 1727204066.89435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
10202 1727204066.89437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204066.89490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204066.89497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204066.89499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204066.89581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204069.31780: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": 
{"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": 
"dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": 
"fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": 
{"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10202 1727204069.33356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204069.33472: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204069.33516: stderr chunk (state=3): >>><<< 10202 1727204069.33568: stdout chunk (state=3): >>><<< 10202 1727204069.33764: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", 
"source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": 
{"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.45.169 closed. 10202 1727204069.37068: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204069.37073: _low_level_execute_command(): starting 10202 1727204069.37076: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204066.7916849-11774-81761116789718/ > /dev/null 2>&1 && sleep 0' 10202 1727204069.38615: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204069.38826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204069.38890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204069.38916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204069.38988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204069.39071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204069.41320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204069.41324: stdout chunk (state=3): >>><<< 10202 1727204069.41326: stderr chunk (state=3): >>><<< 10202 1727204069.41332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204069.41334: handler 
run complete 10202 1727204069.41679: variable 'ansible_facts' from source: unknown 10202 1727204069.42557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204069.43760: variable 'ansible_facts' from source: unknown 10202 1727204069.44008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204069.44856: attempt loop complete, returning result 10202 1727204069.44862: _execute() done 10202 1727204069.44868: dumping result to json 10202 1727204069.44936: done dumping result, returning 10202 1727204069.44970: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-0b04-2570-000000000496] 10202 1727204069.44973: sending task result for task 127b8e07-fff9-0b04-2570-000000000496 10202 1727204069.48427: done sending task result for task 127b8e07-fff9-0b04-2570-000000000496 10202 1727204069.48434: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204069.48547: no more pending results, returning what we have 10202 1727204069.48550: results queue empty 10202 1727204069.48551: checking for any_errors_fatal 10202 1727204069.48555: done checking for any_errors_fatal 10202 1727204069.48556: checking for max_fail_percentage 10202 1727204069.48557: done checking for max_fail_percentage 10202 1727204069.48558: checking to see if all hosts have failed and the running result is not ok 10202 1727204069.48559: done checking to see if all hosts have failed 10202 1727204069.48560: getting the remaining hosts for this loop 10202 1727204069.48561: done getting the remaining hosts for this loop 10202 1727204069.48568: getting the next task for host managed-node3 10202 1727204069.48574: done getting next task for host managed-node3 10202 
1727204069.48578: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 10202 1727204069.48584: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204069.48595: getting variables 10202 1727204069.48597: in VariableManager get_vars() 10202 1727204069.48632: Calling all_inventory to load vars for managed-node3 10202 1727204069.48635: Calling groups_inventory to load vars for managed-node3 10202 1727204069.48638: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204069.48648: Calling all_plugins_play to load vars for managed-node3 10202 1727204069.48651: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204069.48654: Calling groups_plugins_play to load vars for managed-node3 10202 1727204069.52334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204069.57296: done with get_vars() 10202 1727204069.57346: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:29 -0400 (0:00:02.836) 0:00:31.251 ***** 10202 1727204069.57673: entering _queue_task() for managed-node3/package_facts 10202 1727204069.58504: worker is 1 (out of 1 available) 10202 1727204069.58525: exiting _queue_task() for managed-node3/package_facts 10202 1727204069.58541: done queuing things up, now waiting for results queue to drain 10202 1727204069.58543: waiting for pending results... 
10202 1727204069.59284: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 10202 1727204069.59380: in run() - task 127b8e07-fff9-0b04-2570-000000000497 10202 1727204069.59407: variable 'ansible_search_path' from source: unknown 10202 1727204069.59677: variable 'ansible_search_path' from source: unknown 10202 1727204069.59681: calling self._execute() 10202 1727204069.59740: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204069.60072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204069.60076: variable 'omit' from source: magic vars 10202 1727204069.60624: variable 'ansible_distribution_major_version' from source: facts 10202 1727204069.61073: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204069.61077: variable 'omit' from source: magic vars 10202 1727204069.61080: variable 'omit' from source: magic vars 10202 1727204069.61083: variable 'omit' from source: magic vars 10202 1727204069.61101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204069.61147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204069.61471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204069.61476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204069.61479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204069.61483: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204069.61487: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204069.61494: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 10202 1727204069.61611: Set connection var ansible_shell_type to sh 10202 1727204069.61871: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204069.61874: Set connection var ansible_connection to ssh 10202 1727204069.61877: Set connection var ansible_shell_executable to /bin/sh 10202 1727204069.61880: Set connection var ansible_pipelining to False 10202 1727204069.61882: Set connection var ansible_timeout to 10 10202 1727204069.61885: variable 'ansible_shell_executable' from source: unknown 10202 1727204069.61887: variable 'ansible_connection' from source: unknown 10202 1727204069.61890: variable 'ansible_module_compression' from source: unknown 10202 1727204069.61892: variable 'ansible_shell_type' from source: unknown 10202 1727204069.61894: variable 'ansible_shell_executable' from source: unknown 10202 1727204069.61896: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204069.61898: variable 'ansible_pipelining' from source: unknown 10202 1727204069.61900: variable 'ansible_timeout' from source: unknown 10202 1727204069.61902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204069.62315: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204069.62337: variable 'omit' from source: magic vars 10202 1727204069.62349: starting attempt loop 10202 1727204069.62357: running the handler 10202 1727204069.62671: _low_level_execute_command(): starting 10202 1727204069.62675: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204069.64198: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204069.64295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204069.64386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204069.66222: stdout chunk (state=3): >>>/root <<< 10202 1727204069.66370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204069.66452: stderr chunk (state=3): >>><<< 10202 1727204069.66477: stdout chunk (state=3): >>><<< 10202 1727204069.66791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204069.66796: _low_level_execute_command(): starting 10202 1727204069.66799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692 `" && echo ansible-tmp-1727204069.6668932-12009-270263090526692="` echo /root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692 `" ) && sleep 0' 10202 1727204069.67960: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204069.68284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204069.68308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204069.68469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204069.70676: stdout chunk (state=3): >>>ansible-tmp-1727204069.6668932-12009-270263090526692=/root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692 <<< 10202 1727204069.70867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204069.70880: stdout chunk (state=3): >>><<< 10202 1727204069.70892: stderr chunk (state=3): >>><<< 10202 1727204069.70936: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204069.6668932-12009-270263090526692=/root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204069.71027: variable 'ansible_module_compression' from source: unknown 10202 1727204069.71080: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 10202 1727204069.71145: variable 'ansible_facts' from source: unknown 10202 1727204069.71535: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/AnsiballZ_package_facts.py 10202 1727204069.72317: Sending initial data 10202 1727204069.72321: Sent initial data (162 bytes) 10202 1727204069.73584: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204069.73653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204069.73822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 
1727204069.73899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204069.75890: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204069.75895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204069.75898: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpy0aax6w5 /root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/AnsiballZ_package_facts.py <<< 10202 1727204069.75901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/AnsiballZ_package_facts.py" <<< 10202 1727204069.76036: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpy0aax6w5" to remote "/root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/AnsiballZ_package_facts.py" <<< 10202 1727204069.80574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204069.80578: stderr chunk (state=3): >>><<< 10202 1727204069.80581: stdout chunk (state=3): >>><<< 10202 1727204069.80583: done transferring module to remote 10202 1727204069.80586: _low_level_execute_command(): starting 10202 1727204069.80588: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/ /root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/AnsiballZ_package_facts.py && sleep 0' 10202 1727204069.81493: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204069.81498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 10202 1727204069.81605: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204069.81620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204069.81983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204069.82017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204069.84114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204069.84229: stderr chunk (state=3): >>><<< 10202 1727204069.84284: stdout chunk (state=3): >>><<< 10202 1727204069.84302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204069.84312: _low_level_execute_command(): starting 10202 1727204069.84315: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/AnsiballZ_package_facts.py && sleep 0' 10202 1727204069.85814: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204069.85883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204069.85897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204069.86052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 10202 1727204070.50561: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", 
"version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": 
"1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 10202 1727204070.50627: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": 
"6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", 
"release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": 
"0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 10202 1727204070.50750: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", 
"version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": 
"3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": 
"4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": 
"4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version":
"4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", 
"release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": 
"perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": 
[{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", 
"epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": 
[{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", 
"release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 
10202 1727204070.50813: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10202 1727204070.53245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
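The module result that closes above is the output of Ansible's `package_facts` module: under `ansible_facts.packages`, each package name maps to a *list* of installed instances (a list because packages like kernels can be installed multiple times side by side). A minimal sketch of consuming that structure directly in Python — the two sample entries are copied from the log output above; the variable names are illustrative, not part of Ansible's API:

```python
import json

# Minimal sample mirroring the package_facts result structure seen in the
# log above (names/versions copied from that output, trimmed to two keys).
facts_json = '''
{"ansible_facts": {"packages": {
  "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40",
           "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40",
               "epoch": 1, "arch": "x86_64", "source": "rpm"}]
}}}
'''

packages = json.loads(facts_json)["ansible_facts"]["packages"]

# Each value is a list of instances; index [0] picks the first install.
git_version = packages["git"][0]["version"]
openssl_epoch = packages["openssl"][0]["epoch"]
print(git_version, openssl_epoch)
```

In a playbook the same lookup would typically appear as `ansible_facts.packages['git'][0].version` after a `package_facts:` task; checking `'git' in ansible_facts.packages` first guards against hosts where the package is absent.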
<<< 10202 1727204070.53250: stdout chunk (state=3): >>><<< 10202 1727204070.53252: stderr chunk (state=3): >>><<< 10202 1727204070.53281: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", 
"version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", 
"version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": 
"0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": 
"1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": 
"3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": 
"4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": 
"4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": 
"perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": 
"1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": 
"6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", 
"version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", 
"version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204070.62674: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204070.62679: _low_level_execute_command(): starting 10202 1727204070.62682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204069.6668932-12009-270263090526692/ > /dev/null 2>&1 && sleep 0' 10202 1727204070.64063: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204070.64071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204070.64474: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204070.64479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204070.64481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204070.66766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204070.66771: stdout chunk (state=3): >>><<< 10202 1727204070.66777: stderr chunk (state=3): >>><<< 10202 1727204070.66794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204070.66801: handler run complete 10202 1727204070.69823: variable 'ansible_facts' from source: unknown 10202 1727204070.72064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204070.80174: variable 'ansible_facts' from source: unknown 10202 1727204070.80972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204070.83085: attempt loop complete, returning result 10202 1727204070.83117: _execute() done 10202 1727204070.83471: dumping result to json 10202 1727204070.83750: done dumping result, returning 10202 1727204070.83769: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-0b04-2570-000000000497] 10202 1727204070.83781: sending task result for task 127b8e07-fff9-0b04-2570-000000000497 10202 1727204070.90509: done sending task result for task 127b8e07-fff9-0b04-2570-000000000497 10202 1727204070.90513: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204070.90775: no more pending results, returning what we have 10202 1727204070.90778: results queue empty 10202 1727204070.90779: checking for any_errors_fatal 10202 1727204070.90785: done checking for any_errors_fatal 10202 1727204070.90786: checking 
for max_fail_percentage 10202 1727204070.90787: done checking for max_fail_percentage 10202 1727204070.90788: checking to see if all hosts have failed and the running result is not ok 10202 1727204070.90789: done checking to see if all hosts have failed 10202 1727204070.90790: getting the remaining hosts for this loop 10202 1727204070.90791: done getting the remaining hosts for this loop 10202 1727204070.90794: getting the next task for host managed-node3 10202 1727204070.90803: done getting next task for host managed-node3 10202 1727204070.90807: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10202 1727204070.90812: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204070.90823: getting variables 10202 1727204070.90825: in VariableManager get_vars() 10202 1727204070.90861: Calling all_inventory to load vars for managed-node3 10202 1727204070.90984: Calling groups_inventory to load vars for managed-node3 10202 1727204070.90988: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204070.90998: Calling all_plugins_play to load vars for managed-node3 10202 1727204070.91001: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204070.91004: Calling groups_plugins_play to load vars for managed-node3 10202 1727204070.94542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204070.99480: done with get_vars() 10202 1727204070.99512: done getting variables 10202 1727204070.99696: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:30 -0400 (0:00:01.421) 0:00:32.672 ***** 10202 1727204070.99737: entering _queue_task() for managed-node3/debug 10202 1727204071.00731: worker is 1 (out of 1 available) 10202 1727204071.00748: exiting _queue_task() for managed-node3/debug 10202 1727204071.00762: done queuing things up, now waiting for results queue to drain 10202 1727204071.00764: waiting for pending results... 
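For readers following this trace: the censored `package_facts` result above ("Check which packages are installed", `module_args` showing `manager: ['auto']` and `strategy: first`, with the output hidden because `no_log: true` was set) corresponds to a task of roughly this shape. This is a hedged reconstruction from the logged module arguments, not the role's verbatim source:

```yaml
# Reconstructed from the module_args and the censored result in this trace;
# the real task lives in the fedora.linux_system_roles.network role.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
    strategy: first
  no_log: true    # matches the "output has been hidden" result above
```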
10202 1727204071.01495: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 10202 1727204071.01976: in run() - task 127b8e07-fff9-0b04-2570-00000000007d 10202 1727204071.01983: variable 'ansible_search_path' from source: unknown 10202 1727204071.01986: variable 'ansible_search_path' from source: unknown 10202 1727204071.01989: calling self._execute() 10202 1727204071.02162: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204071.02226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204071.02286: variable 'omit' from source: magic vars 10202 1727204071.03786: variable 'ansible_distribution_major_version' from source: facts 10202 1727204071.03790: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204071.03793: variable 'omit' from source: magic vars 10202 1727204071.03826: variable 'omit' from source: magic vars 10202 1727204071.04268: variable 'network_provider' from source: set_fact 10202 1727204071.04352: variable 'omit' from source: magic vars 10202 1727204071.04406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204071.04766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204071.04984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204071.04988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204071.04990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204071.04993: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204071.04996: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 
1727204071.04998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204071.05243: Set connection var ansible_shell_type to sh 10202 1727204071.05319: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204071.05362: Set connection var ansible_connection to ssh 10202 1727204071.05393: Set connection var ansible_shell_executable to /bin/sh 10202 1727204071.05404: Set connection var ansible_pipelining to False 10202 1727204071.05424: Set connection var ansible_timeout to 10 10202 1727204071.05673: variable 'ansible_shell_executable' from source: unknown 10202 1727204071.05677: variable 'ansible_connection' from source: unknown 10202 1727204071.05679: variable 'ansible_module_compression' from source: unknown 10202 1727204071.05681: variable 'ansible_shell_type' from source: unknown 10202 1727204071.05683: variable 'ansible_shell_executable' from source: unknown 10202 1727204071.05685: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204071.05687: variable 'ansible_pipelining' from source: unknown 10202 1727204071.05689: variable 'ansible_timeout' from source: unknown 10202 1727204071.05691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204071.05940: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204071.05961: variable 'omit' from source: magic vars 10202 1727204071.06008: starting attempt loop 10202 1727204071.06017: running the handler 10202 1727204071.06271: handler run complete 10202 1727204071.06275: attempt loop complete, returning result 10202 1727204071.06277: _execute() done 10202 1727204071.06280: dumping result to json 10202 1727204071.06282: done dumping result, returning 
10202 1727204071.06284: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-0b04-2570-00000000007d] 10202 1727204071.06286: sending task result for task 127b8e07-fff9-0b04-2570-00000000007d ok: [managed-node3] => {} MSG: Using network provider: nm 10202 1727204071.06476: no more pending results, returning what we have 10202 1727204071.06480: results queue empty 10202 1727204071.06481: checking for any_errors_fatal 10202 1727204071.06495: done checking for any_errors_fatal 10202 1727204071.06495: checking for max_fail_percentage 10202 1727204071.06497: done checking for max_fail_percentage 10202 1727204071.06498: checking to see if all hosts have failed and the running result is not ok 10202 1727204071.06499: done checking to see if all hosts have failed 10202 1727204071.06500: getting the remaining hosts for this loop 10202 1727204071.06506: done getting the remaining hosts for this loop 10202 1727204071.06511: getting the next task for host managed-node3 10202 1727204071.06519: done getting next task for host managed-node3 10202 1727204071.06524: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10202 1727204071.06529: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10202 1727204071.06542: getting variables 10202 1727204071.06544: in VariableManager get_vars() 10202 1727204071.06642: Calling all_inventory to load vars for managed-node3 10202 1727204071.06645: Calling groups_inventory to load vars for managed-node3 10202 1727204071.06647: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204071.06660: Calling all_plugins_play to load vars for managed-node3 10202 1727204071.06662: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204071.06668: Calling groups_plugins_play to load vars for managed-node3 10202 1727204071.07189: done sending task result for task 127b8e07-fff9-0b04-2570-00000000007d 10202 1727204071.07207: WORKER PROCESS EXITING 10202 1727204071.10443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204071.15741: done with get_vars() 10202 1727204071.15889: done getting variables 10202 1727204071.15968: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.162) 0:00:32.835 ***** 10202 1727204071.16008: entering _queue_task() for managed-node3/fail 10202 1727204071.16943: worker is 1 (out of 1 available) 10202 1727204071.16959: exiting _queue_task() for 
managed-node3/fail 10202 1727204071.17077: done queuing things up, now waiting for results queue to drain 10202 1727204071.17079: waiting for pending results... 10202 1727204071.17899: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10202 1727204071.18864: in run() - task 127b8e07-fff9-0b04-2570-00000000007e 10202 1727204071.18871: variable 'ansible_search_path' from source: unknown 10202 1727204071.18875: variable 'ansible_search_path' from source: unknown 10202 1727204071.18878: calling self._execute() 10202 1727204071.19352: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204071.19357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204071.19359: variable 'omit' from source: magic vars 10202 1727204071.20773: variable 'ansible_distribution_major_version' from source: facts 10202 1727204071.21038: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204071.21383: variable 'network_state' from source: role '' defaults 10202 1727204071.21694: Evaluated conditional (network_state != {}): False 10202 1727204071.21699: when evaluation is False, skipping this task 10202 1727204071.21701: _execute() done 10202 1727204071.21703: dumping result to json 10202 1727204071.21706: done dumping result, returning 10202 1727204071.21708: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-0b04-2570-00000000007e] 10202 1727204071.21712: sending task result for task 127b8e07-fff9-0b04-2570-00000000007e skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10202 1727204071.22133: no more 
pending results, returning what we have 10202 1727204071.22137: results queue empty 10202 1727204071.22138: checking for any_errors_fatal 10202 1727204071.22148: done checking for any_errors_fatal 10202 1727204071.22149: checking for max_fail_percentage 10202 1727204071.22150: done checking for max_fail_percentage 10202 1727204071.22152: checking to see if all hosts have failed and the running result is not ok 10202 1727204071.22153: done checking to see if all hosts have failed 10202 1727204071.22154: getting the remaining hosts for this loop 10202 1727204071.22155: done getting the remaining hosts for this loop 10202 1727204071.22161: getting the next task for host managed-node3 10202 1727204071.22170: done getting next task for host managed-node3 10202 1727204071.22175: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10202 1727204071.22179: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204071.22207: getting variables 10202 1727204071.22210: in VariableManager get_vars() 10202 1727204071.22261: Calling all_inventory to load vars for managed-node3 10202 1727204071.22568: Calling groups_inventory to load vars for managed-node3 10202 1727204071.22573: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204071.22585: Calling all_plugins_play to load vars for managed-node3 10202 1727204071.22588: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204071.22591: Calling groups_plugins_play to load vars for managed-node3 10202 1727204071.23220: done sending task result for task 127b8e07-fff9-0b04-2570-00000000007e 10202 1727204071.23226: WORKER PROCESS EXITING 10202 1727204071.28531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204071.34988: done with get_vars() 10202 1727204071.35033: done getting variables 10202 1727204071.35186: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.193) 0:00:33.028 ***** 10202 1727204071.35345: entering _queue_task() for managed-node3/fail 10202 1727204071.36213: worker is 1 (out of 1 available) 10202 1727204071.36226: exiting _queue_task() for managed-node3/fail 10202 1727204071.36241: done queuing things up, now waiting for results queue to drain 10202 1727204071.36243: waiting for pending results... 
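The "Print network provider" task that ran above (task path `roles/network/tasks/main.yml:7`, result "Using network provider: nm") is an ordinary debug task. A minimal sketch, assuming the message template from the logged output — the role's exact wording may differ:

```yaml
# Sketch only; network_provider was set earlier via set_fact (here: "nm").
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```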
10202 1727204071.36797: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10202 1727204071.37130: in run() - task 127b8e07-fff9-0b04-2570-00000000007f 10202 1727204071.37194: variable 'ansible_search_path' from source: unknown 10202 1727204071.37285: variable 'ansible_search_path' from source: unknown 10202 1727204071.37332: calling self._execute() 10202 1727204071.37558: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204071.37575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204071.37626: variable 'omit' from source: magic vars 10202 1727204071.38647: variable 'ansible_distribution_major_version' from source: facts 10202 1727204071.38697: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204071.39001: variable 'network_state' from source: role '' defaults 10202 1727204071.39019: Evaluated conditional (network_state != {}): False 10202 1727204071.39036: when evaluation is False, skipping this task 10202 1727204071.39138: _execute() done 10202 1727204071.39141: dumping result to json 10202 1727204071.39144: done dumping result, returning 10202 1727204071.39147: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-0b04-2570-00000000007f] 10202 1727204071.39150: sending task result for task 127b8e07-fff9-0b04-2570-00000000007f 10202 1727204071.39517: done sending task result for task 127b8e07-fff9-0b04-2570-00000000007f 10202 1727204071.39521: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10202 1727204071.39691: no more pending results, returning what we have 10202 
1727204071.39695: results queue empty 10202 1727204071.39697: checking for any_errors_fatal 10202 1727204071.39709: done checking for any_errors_fatal 10202 1727204071.39709: checking for max_fail_percentage 10202 1727204071.39712: done checking for max_fail_percentage 10202 1727204071.39713: checking to see if all hosts have failed and the running result is not ok 10202 1727204071.39714: done checking to see if all hosts have failed 10202 1727204071.39715: getting the remaining hosts for this loop 10202 1727204071.39716: done getting the remaining hosts for this loop 10202 1727204071.39722: getting the next task for host managed-node3 10202 1727204071.39732: done getting next task for host managed-node3 10202 1727204071.39737: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10202 1727204071.39743: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204071.39772: getting variables 10202 1727204071.39775: in VariableManager get_vars() 10202 1727204071.39825: Calling all_inventory to load vars for managed-node3 10202 1727204071.39832: Calling groups_inventory to load vars for managed-node3 10202 1727204071.39835: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204071.39850: Calling all_plugins_play to load vars for managed-node3 10202 1727204071.39854: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204071.39858: Calling groups_plugins_play to load vars for managed-node3 10202 1727204071.41998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204071.45752: done with get_vars() 10202 1727204071.45793: done getting variables 10202 1727204071.45857: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.105) 0:00:33.134 ***** 10202 1727204071.45904: entering _queue_task() for managed-node3/fail 10202 1727204071.46361: worker is 1 (out of 1 available) 10202 1727204071.46381: exiting _queue_task() for managed-node3/fail 10202 1727204071.46396: done queuing things up, now waiting for results queue to drain 10202 1727204071.46397: waiting for pending results... 
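Both "Abort applying the network state configuration ..." tasks above were skipped because their shared guard, `network_state != {}`, evaluated to False (`network_state` comes from the role defaults and is empty in this run, as the `false_condition` fields show). A hedged sketch of such a guarded fail task — the `msg` text is assumed, not taken from the role:

```yaml
# Illustrative; the real tasks are at roles/network/tasks/main.yml:11 and :18.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: "network_state is not supported with the initscripts provider"  # assumed text
  when: network_state != {}   # False in this run, so the task is skipped
```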
10202 1727204071.46784: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10202 1727204071.46882: in run() - task 127b8e07-fff9-0b04-2570-000000000080 10202 1727204071.46907: variable 'ansible_search_path' from source: unknown 10202 1727204071.46917: variable 'ansible_search_path' from source: unknown 10202 1727204071.46968: calling self._execute() 10202 1727204071.47081: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204071.47100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204071.47236: variable 'omit' from source: magic vars 10202 1727204071.47567: variable 'ansible_distribution_major_version' from source: facts 10202 1727204071.47587: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204071.47798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204071.51523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204071.51617: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204071.51673: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204071.51716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204071.51757: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204071.51849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204071.51891: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204071.51922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204071.51978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204071.51999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204071.52114: variable 'ansible_distribution_major_version' from source: facts
10202 1727204071.52140: Evaluated conditional (ansible_distribution_major_version | int > 9): True
10202 1727204071.52292: variable 'ansible_distribution' from source: facts
10202 1727204071.52295: variable '__network_rh_distros' from source: role '' defaults
10202 1727204071.52371: Evaluated conditional (ansible_distribution in __network_rh_distros): False
10202 1727204071.52374: when evaluation is False, skipping this task
10202 1727204071.52377: _execute() done
10202 1727204071.52379: dumping result to json
10202 1727204071.52383: done dumping result, returning
10202 1727204071.52385: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-0b04-2570-000000000080]
10202 1727204071.52388: sending task result for task 127b8e07-fff9-0b04-2570-000000000080
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
10202 1727204071.52527: no more pending results, returning what we have
10202 1727204071.52531: results queue empty
10202 1727204071.52532: checking for any_errors_fatal
10202 1727204071.52538: done checking for any_errors_fatal
10202 1727204071.52539: checking for max_fail_percentage
10202 1727204071.52541: done checking for max_fail_percentage
10202 1727204071.52542: checking to see if all hosts have failed and the running result is not ok
10202 1727204071.52543: done checking to see if all hosts have failed
10202 1727204071.52544: getting the remaining hosts for this loop
10202 1727204071.52546: done getting the remaining hosts for this loop
10202 1727204071.52550: getting the next task for host managed-node3
10202 1727204071.52557: done getting next task for host managed-node3
10202 1727204071.52561: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
10202 1727204071.52567: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
10202 1727204071.52589: getting variables
10202 1727204071.52591: in VariableManager get_vars()
10202 1727204071.52632: Calling all_inventory to load vars for managed-node3
10202 1727204071.52635: Calling groups_inventory to load vars for managed-node3
10202 1727204071.52637: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204071.52649: Calling all_plugins_play to load vars for managed-node3
10202 1727204071.52652: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204071.52655: Calling groups_plugins_play to load vars for managed-node3
10202 1727204071.53172: done sending task result for task 127b8e07-fff9-0b04-2570-000000000080
10202 1727204071.53175: WORKER PROCESS EXITING
10202 1727204071.56768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204071.60297: done with get_vars()
10202 1727204071.60405: done getting variables
10202 1727204071.60642: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.147) 0:00:33.282 *****
10202 1727204071.60734: entering _queue_task() for managed-node3/dnf
10202 1727204071.61751: worker is 1 (out of 1 available)
10202 1727204071.61823: exiting _queue_task() for managed-node3/dnf
10202 1727204071.61837: done queuing things up, now waiting for results queue to drain
10202 1727204071.61843: waiting for pending results...
10202 1727204071.62255: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
10202 1727204071.62395: in run() - task 127b8e07-fff9-0b04-2570-000000000081
10202 1727204071.62418: variable 'ansible_search_path' from source: unknown
10202 1727204071.62433: variable 'ansible_search_path' from source: unknown
10202 1727204071.62491: calling self._execute()
10202 1727204071.62622: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204071.62692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204071.62703: variable 'omit' from source: magic vars
10202 1727204071.63150: variable 'ansible_distribution_major_version' from source: facts
10202 1727204071.63171: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204071.63437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
10202 1727204071.67680: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
10202 1727204071.67974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
10202 1727204071.68072: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
10202 1727204071.68076: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
10202 1727204071.68144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
10202 1727204071.68439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204071.68595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204071.68683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204071.68780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204071.68799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204071.68951: variable 'ansible_distribution' from source: facts
10202 1727204071.68973: variable 'ansible_distribution_major_version' from source: facts
10202 1727204071.68987: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
10202 1727204071.69156: variable '__network_wireless_connections_defined' from source: role '' defaults
10202 1727204071.69380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204071.69425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204071.69459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204071.69532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204071.69570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204071.69603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204071.69648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204071.69680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204071.69747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204071.69762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204071.69836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204071.69849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204071.69886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204071.69935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204071.70053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204071.70373: variable 'network_connections' from source: task vars
10202 1727204071.70403: variable 'port2_profile' from source: play vars
10202 1727204071.70571: variable 'port2_profile' from source: play vars
10202 1727204071.70588: variable 'port1_profile' from source: play vars
10202 1727204071.70733: variable 'port1_profile' from source: play vars
10202 1727204071.70739: variable 'controller_profile' from source: play vars
10202 1727204071.70817: variable 'controller_profile' from source: play vars
10202 1727204071.71064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
10202 1727204071.72277: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
10202 1727204071.72386: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
10202 1727204071.72421: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
10202 1727204071.72459: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
10202 1727204071.72637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
10202 1727204071.72816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
10202 1727204071.72820: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204071.72886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
10202 1727204071.73044: variable '__network_team_connections_defined' from source: role '' defaults
10202 1727204071.73582: variable 'network_connections' from source: task vars
10202 1727204071.73606: variable 'port2_profile' from source: play vars
10202 1727204071.73971: variable 'port2_profile' from source: play vars
10202 1727204071.73974: variable 'port1_profile' from source: play vars
10202 1727204071.73977: variable 'port1_profile' from source: play vars
10202 1727204071.73979: variable 'controller_profile' from source: play vars
10202 1727204071.73981: variable 'controller_profile' from source: play vars
10202 1727204071.73984: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
10202 1727204071.73986: when evaluation is False, skipping this task
10202 1727204071.73989: _execute() done
10202 1727204071.73991: dumping result to json
10202 1727204071.73993: done dumping result, returning
10202 1727204071.73996: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-0b04-2570-000000000081]
10202 1727204071.73998: sending task result for task 127b8e07-fff9-0b04-2570-000000000081
10202 1727204071.74434: done sending task result for task 127b8e07-fff9-0b04-2570-000000000081
10202 1727204071.74438: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
10202 1727204071.74505: no more pending results, returning what we have
10202 1727204071.74509: results queue empty
10202 1727204071.74510: checking for any_errors_fatal
10202 1727204071.74517: done checking for any_errors_fatal
10202 1727204071.74518: checking for max_fail_percentage
10202 1727204071.74520: done checking for max_fail_percentage
10202 1727204071.74521: checking to see if all hosts have failed and the running result is not ok
10202 1727204071.74523: done checking to see if all hosts have failed
10202 1727204071.74523: getting the remaining hosts for this loop
10202 1727204071.74525: done getting the remaining hosts for this loop
10202 1727204071.74533: getting the next task for host managed-node3
10202 1727204071.74542: done getting next task for host managed-node3
10202 1727204071.74550: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
10202 1727204071.74557: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
10202 1727204071.74581: getting variables
10202 1727204071.74583: in VariableManager get_vars()
10202 1727204071.74638: Calling all_inventory to load vars for managed-node3
10202 1727204071.74642: Calling groups_inventory to load vars for managed-node3
10202 1727204071.74644: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204071.74661: Calling all_plugins_play to load vars for managed-node3
10202 1727204071.74854: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204071.74861: Calling groups_plugins_play to load vars for managed-node3
10202 1727204071.77713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204071.80070: done with get_vars()
10202 1727204071.80118: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
10202 1727204071.80209: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.195) 0:00:33.477 *****
10202 1727204071.80255: entering _queue_task() for managed-node3/yum
10202 1727204071.80880: worker is 1 (out of 1 available)
10202 1727204071.80892: exiting _queue_task() for managed-node3/yum
10202 1727204071.80904: done queuing things up, now waiting for results queue to drain
10202 1727204071.80905: waiting for pending results...
10202 1727204071.81047: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
10202 1727204071.81375: in run() - task 127b8e07-fff9-0b04-2570-000000000082
10202 1727204071.81421: variable 'ansible_search_path' from source: unknown
10202 1727204071.81425: variable 'ansible_search_path' from source: unknown
10202 1727204071.81499: calling self._execute()
10202 1727204071.81814: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204071.81818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204071.81821: variable 'omit' from source: magic vars
10202 1727204071.82394: variable 'ansible_distribution_major_version' from source: facts
10202 1727204071.82422: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204071.82820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
10202 1727204071.86726: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
10202 1727204071.86826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
10202 1727204071.86884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
10202 1727204071.86935: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
10202 1727204071.86975: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
10202 1727204071.87075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204071.87119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204071.87160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204071.87253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204071.87256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204071.87375: variable 'ansible_distribution_major_version' from source: facts
10202 1727204071.87411: Evaluated conditional (ansible_distribution_major_version | int < 8): False
10202 1727204071.87470: when evaluation is False, skipping this task
10202 1727204071.87474: _execute() done
10202 1727204071.87478: dumping result to json
10202 1727204071.87480: done dumping result, returning
10202 1727204071.87483: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-0b04-2570-000000000082]
10202 1727204071.87486: sending task result for task 127b8e07-fff9-0b04-2570-000000000082
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
10202 1727204071.87825: no more pending results, returning what we have
10202 1727204071.87831: results queue empty
10202 1727204071.87832: checking for any_errors_fatal
10202 1727204071.87839: done checking for any_errors_fatal
10202 1727204071.87840: checking for max_fail_percentage
10202 1727204071.87841: done checking for max_fail_percentage
10202 1727204071.87842: checking to see if all hosts have failed and the running result is not ok
10202 1727204071.87844: done checking to see if all hosts have failed
10202 1727204071.87845: getting the remaining hosts for this loop
10202 1727204071.87846: done getting the remaining hosts for this loop
10202 1727204071.87851: getting the next task for host managed-node3
10202 1727204071.87858: done getting next task for host managed-node3
10202 1727204071.87862: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
10202 1727204071.87869: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
10202 1727204071.87891: getting variables
10202 1727204071.87893: in VariableManager get_vars()
10202 1727204071.87939: Calling all_inventory to load vars for managed-node3
10202 1727204071.87943: Calling groups_inventory to load vars for managed-node3
10202 1727204071.87945: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204071.87955: Calling all_plugins_play to load vars for managed-node3
10202 1727204071.87958: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204071.87961: Calling groups_plugins_play to load vars for managed-node3
10202 1727204071.88090: done sending task result for task 127b8e07-fff9-0b04-2570-000000000082
10202 1727204071.88093: WORKER PROCESS EXITING
10202 1727204071.90282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204071.93061: done with get_vars()
10202 1727204071.93096: done getting variables
10202 1727204071.93181: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.129) 0:00:33.607 *****
10202 1727204071.93236: entering _queue_task() for managed-node3/fail
10202 1727204071.94018: worker is 1 (out of 1 available)
10202 1727204071.94032: exiting _queue_task() for managed-node3/fail
10202 1727204071.94044: done queuing things up, now waiting for results queue to drain
10202 1727204071.94045: waiting for pending results...
10202 1727204071.94274: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
10202 1727204071.94508: in run() - task 127b8e07-fff9-0b04-2570-000000000083
10202 1727204071.94532: variable 'ansible_search_path' from source: unknown
10202 1727204071.94540: variable 'ansible_search_path' from source: unknown
10202 1727204071.94595: calling self._execute()
10202 1727204071.94725: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204071.94741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204071.94755: variable 'omit' from source: magic vars
10202 1727204071.95323: variable 'ansible_distribution_major_version' from source: facts
10202 1727204071.95372: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204071.95526: variable '__network_wireless_connections_defined' from source: role '' defaults
10202 1727204071.95851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
10202 1727204072.09431: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
10202 1727204072.09508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
10202 1727204072.09709: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
10202 1727204072.09713: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
10202 1727204072.09715: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
10202 1727204072.09761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204072.09834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204072.09893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204072.09934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204072.09948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204072.10030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204072.10049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204072.10107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204072.10143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204072.10157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204072.10276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10202 1727204072.10280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10202 1727204072.10283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204072.10333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10202 1727204072.10339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10202 1727204072.10624: variable 'network_connections' from source: task vars
10202 1727204072.10630: variable 'port2_profile' from source: play vars
10202 1727204072.10716: variable 'port2_profile' from source: play vars
10202 1727204072.10727: variable 'port1_profile' from source: play vars
10202 1727204072.10940: variable 'port1_profile' from source: play vars
10202 1727204072.10944: variable 'controller_profile' from source: play vars
10202 1727204072.11047: variable 'controller_profile' from source: play vars
10202 1727204072.11278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
10202 1727204072.11464: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
10202 1727204072.11505: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
10202 1727204072.11538: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
10202 1727204072.11585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
10202 1727204072.11634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
10202 1727204072.11694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
10202 1727204072.11720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
10202 1727204072.11747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
10202 1727204072.11798: variable '__network_team_connections_defined' from source: role '' defaults
10202 1727204072.12085: variable 'network_connections' from source: task vars
10202 1727204072.12089: variable 'port2_profile' from source: play vars
10202 1727204072.12183: variable 'port2_profile' from source: play vars
10202 1727204072.12187: variable 'port1_profile' from source: play vars
10202 1727204072.12259: variable 'port1_profile' from source: play vars
10202 1727204072.12280: variable 'controller_profile' from source: play vars
10202 1727204072.12326: variable 'controller_profile' from source: play vars
10202 1727204072.12354: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
10202 1727204072.12372: when evaluation is False, skipping this task
10202 1727204072.12375: _execute() done
10202 1727204072.12378: dumping result to json
10202 1727204072.12380: done dumping result, returning
10202 1727204072.12385: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-0b04-2570-000000000083]
10202 1727204072.12388: sending task result for task 127b8e07-fff9-0b04-2570-000000000083
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
10202 1727204072.12647: no more pending results, returning what we have
10202 1727204072.12650: results queue empty
10202 1727204072.12651: checking for any_errors_fatal
10202 1727204072.12658: done checking for any_errors_fatal
10202 1727204072.12659: checking for max_fail_percentage
10202 1727204072.12661: done checking for max_fail_percentage
10202 1727204072.12662: checking to see if all hosts have failed and the running result is not ok
10202 1727204072.12663: done checking to see if all hosts have failed
10202 1727204072.12664: getting the remaining hosts for this loop
10202 1727204072.12667: done getting the remaining hosts for this loop
10202 1727204072.12671: getting the next task for host managed-node3
10202 1727204072.12679: done getting next task for host managed-node3
10202 1727204072.12685: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
10202 1727204072.12689: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
10202 1727204072.12709: getting variables
10202 1727204072.12710: in VariableManager get_vars()
10202 1727204072.12751: Calling all_inventory to load vars for managed-node3
10202 1727204072.12754: Calling groups_inventory to load vars for managed-node3
10202 1727204072.12757: Calling all_plugins_inventory to load vars for managed-node3
10202 1727204072.12805: Calling all_plugins_play to load vars for managed-node3
10202 1727204072.12811: Calling groups_plugins_inventory to load vars for managed-node3
10202 1727204072.12816: done sending task result for task 127b8e07-fff9-0b04-2570-000000000083
10202 1727204072.12819: WORKER PROCESS EXITING
10202 1727204072.12826: Calling groups_plugins_play to load vars for managed-node3
10202 1727204072.24916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10202 1727204072.28260: done with get_vars()
10202 1727204072.28308: done getting variables
10202 1727204072.28370: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.352) 0:00:33.960 *****
10202 1727204072.28510: entering _queue_task() for managed-node3/package
10202 1727204072.29333: worker is 1 (out of 1 available)
10202 1727204072.29351: exiting _queue_task() for managed-node3/package
10202 1727204072.29368: done queuing things up, now waiting for results queue to drain
10202 1727204072.29370: waiting for pending results...
10202 1727204072.29950: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages
10202 1727204072.30193: in run() - task 127b8e07-fff9-0b04-2570-000000000084
10202 1727204072.30297: variable 'ansible_search_path' from source: unknown
10202 1727204072.30302: variable 'ansible_search_path' from source: unknown
10202 1727204072.30306: calling self._execute()
10202 1727204072.30422: variable 'ansible_host' from source: host vars for 'managed-node3'
10202 1727204072.30430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
10202 1727204072.30446: variable 'omit' from source: magic vars
10202 1727204072.30954: variable 'ansible_distribution_major_version' from source: facts
10202 1727204072.31060: Evaluated conditional (ansible_distribution_major_version != '6'): True
10202 1727204072.31232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
10202 1727204072.31927: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
10202 1727204072.32210: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
10202 1727204072.32282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
10202 1727204072.32337: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
10202
1727204072.32684: variable 'network_packages' from source: role '' defaults 10202 1727204072.32933: variable '__network_provider_setup' from source: role '' defaults 10202 1727204072.32990: variable '__network_service_name_default_nm' from source: role '' defaults 10202 1727204072.33168: variable '__network_service_name_default_nm' from source: role '' defaults 10202 1727204072.33259: variable '__network_packages_default_nm' from source: role '' defaults 10202 1727204072.33451: variable '__network_packages_default_nm' from source: role '' defaults 10202 1727204072.34076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204072.37468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204072.37518: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204072.37553: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204072.37582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204072.37741: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204072.37813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.37838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.37856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10202 1727204072.37918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.37922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.37960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.37980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.38010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.38046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.38058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.38298: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10202 1727204072.38473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.38477: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.38479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.38502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.38517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.38671: variable 'ansible_python' from source: facts 10202 1727204072.38675: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10202 1727204072.38757: variable '__network_wpa_supplicant_required' from source: role '' defaults 10202 1727204072.38852: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10202 1727204072.38961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.38994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.39014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.39060: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.39072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.39109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.39145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.39291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.39295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.39298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.39426: variable 'network_connections' from source: task vars 10202 1727204072.39430: variable 'port2_profile' from source: play vars 10202 1727204072.39676: variable 'port2_profile' from source: play vars 10202 1727204072.39679: variable 'port1_profile' from source: play vars 10202 1727204072.39682: variable 'port1_profile' from source: play vars 10202 1727204072.39684: variable 'controller_profile' from source: play vars 10202 1727204072.39707: 
variable 'controller_profile' from source: play vars 10202 1727204072.39790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204072.39819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204072.39852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.39891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204072.39937: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204072.40246: variable 'network_connections' from source: task vars 10202 1727204072.40251: variable 'port2_profile' from source: play vars 10202 1727204072.40351: variable 'port2_profile' from source: play vars 10202 1727204072.40380: variable 'port1_profile' from source: play vars 10202 1727204072.40452: variable 'port1_profile' from source: play vars 10202 1727204072.40462: variable 'controller_profile' from source: play vars 10202 1727204072.40534: variable 'controller_profile' from source: play vars 10202 1727204072.40567: variable '__network_packages_default_wireless' from source: role '' defaults 10202 1727204072.40625: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204072.40845: variable 'network_connections' from source: task vars 10202 1727204072.40848: variable 'port2_profile' from source: play vars 10202 1727204072.40899: variable 'port2_profile' from source: play vars 10202 1727204072.40906: variable 
'port1_profile' from source: play vars 10202 1727204072.40954: variable 'port1_profile' from source: play vars 10202 1727204072.40961: variable 'controller_profile' from source: play vars 10202 1727204072.41012: variable 'controller_profile' from source: play vars 10202 1727204072.41036: variable '__network_packages_default_team' from source: role '' defaults 10202 1727204072.41094: variable '__network_team_connections_defined' from source: role '' defaults 10202 1727204072.41312: variable 'network_connections' from source: task vars 10202 1727204072.41317: variable 'port2_profile' from source: play vars 10202 1727204072.41368: variable 'port2_profile' from source: play vars 10202 1727204072.41374: variable 'port1_profile' from source: play vars 10202 1727204072.41421: variable 'port1_profile' from source: play vars 10202 1727204072.41431: variable 'controller_profile' from source: play vars 10202 1727204072.41480: variable 'controller_profile' from source: play vars 10202 1727204072.41522: variable '__network_service_name_default_initscripts' from source: role '' defaults 10202 1727204072.41572: variable '__network_service_name_default_initscripts' from source: role '' defaults 10202 1727204072.41578: variable '__network_packages_default_initscripts' from source: role '' defaults 10202 1727204072.41623: variable '__network_packages_default_initscripts' from source: role '' defaults 10202 1727204072.41782: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10202 1727204072.42128: variable 'network_connections' from source: task vars 10202 1727204072.42136: variable 'port2_profile' from source: play vars 10202 1727204072.42302: variable 'port2_profile' from source: play vars 10202 1727204072.42305: variable 'port1_profile' from source: play vars 10202 1727204072.42307: variable 'port1_profile' from source: play vars 10202 1727204072.42310: variable 'controller_profile' from source: play vars 10202 1727204072.42432: variable 
'controller_profile' from source: play vars 10202 1727204072.42435: variable 'ansible_distribution' from source: facts 10202 1727204072.42438: variable '__network_rh_distros' from source: role '' defaults 10202 1727204072.42440: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.42442: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10202 1727204072.42579: variable 'ansible_distribution' from source: facts 10202 1727204072.42583: variable '__network_rh_distros' from source: role '' defaults 10202 1727204072.42585: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.42588: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10202 1727204072.42731: variable 'ansible_distribution' from source: facts 10202 1727204072.42737: variable '__network_rh_distros' from source: role '' defaults 10202 1727204072.42743: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.42816: variable 'network_provider' from source: set_fact 10202 1727204072.42821: variable 'ansible_facts' from source: unknown 10202 1727204072.43565: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10202 1727204072.43571: when evaluation is False, skipping this task 10202 1727204072.43574: _execute() done 10202 1727204072.43586: dumping result to json 10202 1727204072.43591: done dumping result, returning 10202 1727204072.43593: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-0b04-2570-000000000084] 10202 1727204072.43596: sending task result for task 127b8e07-fff9-0b04-2570-000000000084 10202 1727204072.43700: done sending task result for task 127b8e07-fff9-0b04-2570-000000000084 10202 1727204072.43708: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is 
subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10202 1727204072.43758: no more pending results, returning what we have 10202 1727204072.43762: results queue empty 10202 1727204072.43763: checking for any_errors_fatal 10202 1727204072.43783: done checking for any_errors_fatal 10202 1727204072.43784: checking for max_fail_percentage 10202 1727204072.43786: done checking for max_fail_percentage 10202 1727204072.43787: checking to see if all hosts have failed and the running result is not ok 10202 1727204072.43788: done checking to see if all hosts have failed 10202 1727204072.43789: getting the remaining hosts for this loop 10202 1727204072.43791: done getting the remaining hosts for this loop 10202 1727204072.43809: getting the next task for host managed-node3 10202 1727204072.43817: done getting next task for host managed-node3 10202 1727204072.43821: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10202 1727204072.43824: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204072.43844: getting variables 10202 1727204072.43846: in VariableManager get_vars() 10202 1727204072.43889: Calling all_inventory to load vars for managed-node3 10202 1727204072.43892: Calling groups_inventory to load vars for managed-node3 10202 1727204072.43894: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204072.43913: Calling all_plugins_play to load vars for managed-node3 10202 1727204072.43916: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204072.43920: Calling groups_plugins_play to load vars for managed-node3 10202 1727204072.45020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204072.46204: done with get_vars() 10202 1727204072.46226: done getting variables 10202 1727204072.46279: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.177) 0:00:34.138 ***** 10202 1727204072.46310: entering _queue_task() for managed-node3/package 10202 1727204072.46606: worker is 1 (out of 1 available) 10202 1727204072.46621: exiting _queue_task() for managed-node3/package 10202 1727204072.46636: done queuing things up, now waiting for results queue to drain 10202 1727204072.46638: waiting for pending results... 
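The "Install packages" skip above is driven by the task's `when` clause, `not network_packages is subset(ansible_facts.packages.keys())`: the task only runs if some required package is missing from the gathered package facts. A minimal Python sketch of that Jinja `subset` test follows — the function name and package names are illustrative only; in the role the lists come from `network_packages` defaults and the `packages` fact.

```python
def install_needed(network_packages, installed_packages):
    """Mirror of the task's 'when' condition: True only when at least one
    required package is NOT already present in the installed-package facts.
    Jinja's 'a is subset(b)' corresponds to set(a) <= set(b)."""
    return not set(network_packages) <= set(installed_packages)

# Every required package already installed -> condition evaluates False,
# so the task is skipped, matching the "skip_reason" in the log above.
print(install_needed(["NetworkManager"], {"NetworkManager": "1.46", "kernel": "6.10"}))
```

Because the conditional came back `False`, no package module ever ran on the remote host; the whole decision was made from cached facts on the controller.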
10202 1727204072.46855: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10202 1727204072.46993: in run() - task 127b8e07-fff9-0b04-2570-000000000085 10202 1727204072.47004: variable 'ansible_search_path' from source: unknown 10202 1727204072.47008: variable 'ansible_search_path' from source: unknown 10202 1727204072.47042: calling self._execute() 10202 1727204072.47125: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204072.47133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204072.47143: variable 'omit' from source: magic vars 10202 1727204072.47461: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.47473: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204072.47567: variable 'network_state' from source: role '' defaults 10202 1727204072.47576: Evaluated conditional (network_state != {}): False 10202 1727204072.47579: when evaluation is False, skipping this task 10202 1727204072.47583: _execute() done 10202 1727204072.47585: dumping result to json 10202 1727204072.47590: done dumping result, returning 10202 1727204072.47598: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-0b04-2570-000000000085] 10202 1727204072.47604: sending task result for task 127b8e07-fff9-0b04-2570-000000000085 10202 1727204072.47713: done sending task result for task 127b8e07-fff9-0b04-2570-000000000085 10202 1727204072.47716: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10202 1727204072.47793: no more pending results, returning what we have 10202 1727204072.47796: results queue empty 10202 1727204072.47798: checking 
for any_errors_fatal 10202 1727204072.47803: done checking for any_errors_fatal 10202 1727204072.47804: checking for max_fail_percentage 10202 1727204072.47806: done checking for max_fail_percentage 10202 1727204072.47807: checking to see if all hosts have failed and the running result is not ok 10202 1727204072.47808: done checking to see if all hosts have failed 10202 1727204072.47809: getting the remaining hosts for this loop 10202 1727204072.47811: done getting the remaining hosts for this loop 10202 1727204072.47815: getting the next task for host managed-node3 10202 1727204072.47822: done getting next task for host managed-node3 10202 1727204072.47828: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10202 1727204072.47833: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204072.47853: getting variables 10202 1727204072.47854: in VariableManager get_vars() 10202 1727204072.47895: Calling all_inventory to load vars for managed-node3 10202 1727204072.47898: Calling groups_inventory to load vars for managed-node3 10202 1727204072.47900: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204072.47910: Calling all_plugins_play to load vars for managed-node3 10202 1727204072.47913: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204072.47915: Calling groups_plugins_play to load vars for managed-node3 10202 1727204072.49456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204072.50740: done with get_vars() 10202 1727204072.50770: done getting variables 10202 1727204072.50822: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.045) 0:00:34.183 ***** 10202 1727204072.50852: entering _queue_task() for managed-node3/package 10202 1727204072.51147: worker is 1 (out of 1 available) 10202 1727204072.51163: exiting _queue_task() for managed-node3/package 10202 1727204072.51178: done queuing things up, now waiting for results queue to drain 10202 1727204072.51180: waiting for pending results... 
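Both nmstate-related install tasks ("Install NetworkManager and nmstate..." and "Install python3-libnmstate...") are gated on the same conditional, `network_state != {}`. Since `network_state` comes from role defaults as an empty dict here, both evaluate `False` and skip. A sketch of that guard (illustrative function name; the real check is a Jinja2 expression over the role variable):

```python
def network_state_tasks_run(network_state):
    """The 'when' guard shared by the nmstate install tasks: they run
    only when the caller supplied a non-empty network_state mapping."""
    return network_state != {}

print(network_state_tasks_run({}))                  # role default -> tasks skip
print(network_state_tasks_run({"interfaces": []}))  # non-empty -> tasks would run
```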
10202 1727204072.51383: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10202 1727204072.51572: in run() - task 127b8e07-fff9-0b04-2570-000000000086 10202 1727204072.51577: variable 'ansible_search_path' from source: unknown 10202 1727204072.51582: variable 'ansible_search_path' from source: unknown 10202 1727204072.51584: calling self._execute() 10202 1727204072.52075: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204072.52079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204072.52082: variable 'omit' from source: magic vars 10202 1727204072.52118: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.52128: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204072.52423: variable 'network_state' from source: role '' defaults 10202 1727204072.52426: Evaluated conditional (network_state != {}): False 10202 1727204072.52429: when evaluation is False, skipping this task 10202 1727204072.52431: _execute() done 10202 1727204072.52434: dumping result to json 10202 1727204072.52436: done dumping result, returning 10202 1727204072.52438: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-0b04-2570-000000000086] 10202 1727204072.52440: sending task result for task 127b8e07-fff9-0b04-2570-000000000086 10202 1727204072.52516: done sending task result for task 127b8e07-fff9-0b04-2570-000000000086 10202 1727204072.52519: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10202 1727204072.52573: no more pending results, returning what we have 10202 1727204072.52576: results queue empty 10202 1727204072.52577: checking for 
any_errors_fatal 10202 1727204072.52582: done checking for any_errors_fatal 10202 1727204072.52583: checking for max_fail_percentage 10202 1727204072.52585: done checking for max_fail_percentage 10202 1727204072.52586: checking to see if all hosts have failed and the running result is not ok 10202 1727204072.52587: done checking to see if all hosts have failed 10202 1727204072.52587: getting the remaining hosts for this loop 10202 1727204072.52589: done getting the remaining hosts for this loop 10202 1727204072.52592: getting the next task for host managed-node3 10202 1727204072.52599: done getting next task for host managed-node3 10202 1727204072.52603: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10202 1727204072.52607: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204072.52624: getting variables 10202 1727204072.52626: in VariableManager get_vars() 10202 1727204072.52704: Calling all_inventory to load vars for managed-node3 10202 1727204072.52707: Calling groups_inventory to load vars for managed-node3 10202 1727204072.52709: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204072.52718: Calling all_plugins_play to load vars for managed-node3 10202 1727204072.52720: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204072.52723: Calling groups_plugins_play to load vars for managed-node3 10202 1727204072.54655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204072.57194: done with get_vars() 10202 1727204072.57246: done getting variables 10202 1727204072.57339: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.065) 0:00:34.249 ***** 10202 1727204072.57398: entering _queue_task() for managed-node3/service 10202 1727204072.58088: worker is 1 (out of 1 available) 10202 1727204072.58104: exiting _queue_task() for managed-node3/service 10202 1727204072.58120: done queuing things up, now waiting for results queue to drain 10202 1727204072.58121: waiting for pending results... 
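The "Restart NetworkManager due to wireless or team interfaces" task queued here uses the same guard already evaluated for the earlier consent task: `__network_wireless_connections_defined or __network_team_connections_defined`. With neither wireless nor team connections in the requested profiles, the expression is `False`. A one-line sketch of that boolean (illustrative wrapper; in the role these are Jinja2 variables derived from `network_connections`):

```python
def restart_nm_needed(wireless_defined, team_defined):
    """Guard on the NetworkManager restart tasks: a restart is only
    warranted when wireless or team interfaces were requested."""
    return wireless_defined or team_defined

# Neither connection type defined in this run -> restart tasks skip.
print(restart_nm_needed(False, False))
```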
10202 1727204072.58354: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10202 1727204072.58520: in run() - task 127b8e07-fff9-0b04-2570-000000000087 10202 1727204072.58528: variable 'ansible_search_path' from source: unknown 10202 1727204072.58599: variable 'ansible_search_path' from source: unknown 10202 1727204072.58604: calling self._execute() 10202 1727204072.58775: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204072.58780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204072.58783: variable 'omit' from source: magic vars 10202 1727204072.59343: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.59347: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204072.59518: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204072.59694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204072.63457: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204072.63555: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204072.63601: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204072.63673: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204072.63676: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204072.63874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 10202 1727204072.63879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.63883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.63885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.63899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.63949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.64056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.64060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.64062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.64067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.64263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.64268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.64272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.64274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.64276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.64484: variable 'network_connections' from source: task vars 10202 1727204072.64488: variable 'port2_profile' from source: play vars 10202 1727204072.64505: variable 'port2_profile' from source: play vars 10202 1727204072.64515: variable 'port1_profile' from source: play vars 10202 1727204072.64583: variable 'port1_profile' from source: play vars 10202 1727204072.64592: variable 'controller_profile' from source: play vars 10202 1727204072.64659: variable 'controller_profile' from source: play vars 10202 1727204072.64742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204072.65151: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204072.65195: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204072.65239: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204072.65294: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204072.65368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204072.65371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204072.65505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.65509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204072.65512: variable '__network_team_connections_defined' from source: role '' defaults 10202 1727204072.65772: variable 'network_connections' from source: task vars 10202 1727204072.65775: variable 'port2_profile' from source: play vars 10202 1727204072.65978: variable 'port2_profile' from source: play vars 10202 1727204072.65981: variable 'port1_profile' from source: play vars 10202 1727204072.66044: variable 'port1_profile' from source: play vars 10202 1727204072.66052: variable 'controller_profile' from source: play vars 10202 1727204072.66339: variable 'controller_profile' from source: play vars 10202 1727204072.66370: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 10202 1727204072.66676: when evaluation is False, skipping this task 10202 1727204072.66679: _execute() done 10202 1727204072.66684: dumping result to json 10202 1727204072.66687: done dumping result, returning 10202 1727204072.66698: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-0b04-2570-000000000087] 10202 1727204072.66703: sending task result for task 127b8e07-fff9-0b04-2570-000000000087 10202 1727204072.66974: done sending task result for task 127b8e07-fff9-0b04-2570-000000000087 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10202 1727204072.67036: no more pending results, returning what we have 10202 1727204072.67039: results queue empty 10202 1727204072.67040: checking for any_errors_fatal 10202 1727204072.67048: done checking for any_errors_fatal 10202 1727204072.67049: checking for max_fail_percentage 10202 1727204072.67050: done checking for max_fail_percentage 10202 1727204072.67051: checking to see if all hosts have failed and the running result is not ok 10202 1727204072.67053: done checking to see if all hosts have failed 10202 1727204072.67053: getting the remaining hosts for this loop 10202 1727204072.67055: done getting the remaining hosts for this loop 10202 1727204072.67061: getting the next task for host managed-node3 10202 1727204072.67073: done getting next task for host managed-node3 10202 1727204072.67078: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10202 1727204072.67083: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10202 1727204072.67109: getting variables 10202 1727204072.67111: in VariableManager get_vars() 10202 1727204072.67164: Calling all_inventory to load vars for managed-node3 10202 1727204072.67170: Calling groups_inventory to load vars for managed-node3 10202 1727204072.67173: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204072.67187: Calling all_plugins_play to load vars for managed-node3 10202 1727204072.67191: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204072.67195: Calling groups_plugins_play to load vars for managed-node3 10202 1727204072.68079: WORKER PROCESS EXITING 10202 1727204072.72523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204072.76701: done with get_vars() 10202 1727204072.76859: done getting variables 10202 1727204072.76932: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.196) 0:00:34.446 ***** 10202 1727204072.77089: entering _queue_task() for managed-node3/service 10202 1727204072.77553: worker is 1 (out of 1 available) 10202 1727204072.77570: exiting _queue_task() for managed-node3/service 10202 1727204072.77586: done queuing things up, now waiting for results queue to drain 10202 1727204072.77587: waiting for pending results... 10202 1727204072.77976: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10202 1727204072.78149: in run() - task 127b8e07-fff9-0b04-2570-000000000088 10202 1727204072.78217: variable 'ansible_search_path' from source: unknown 10202 1727204072.78221: variable 'ansible_search_path' from source: unknown 10202 1727204072.78225: calling self._execute() 10202 1727204072.78382: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204072.78386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204072.78389: variable 'omit' from source: magic vars 10202 1727204072.79775: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.79779: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204072.80000: variable 'network_provider' from source: set_fact 10202 1727204072.80005: variable 'network_state' from source: role '' defaults 10202 1727204072.80076: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10202 1727204072.80080: variable 'omit' from source: magic vars 10202 1727204072.80226: variable 'omit' from source: magic vars 10202 1727204072.80263: variable 'network_service_name' from source: role '' defaults 10202 1727204072.80471: variable 'network_service_name' from source: role '' defaults 10202 1727204072.80733: variable '__network_provider_setup' from 
source: role '' defaults 10202 1727204072.80738: variable '__network_service_name_default_nm' from source: role '' defaults 10202 1727204072.80826: variable '__network_service_name_default_nm' from source: role '' defaults 10202 1727204072.80849: variable '__network_packages_default_nm' from source: role '' defaults 10202 1727204072.80917: variable '__network_packages_default_nm' from source: role '' defaults 10202 1727204072.81215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204072.85115: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204072.85330: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204072.85369: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204072.85533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204072.85625: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204072.85894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.85898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.85901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.85969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.86005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.86169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.86172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.86176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.86179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.86198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.86483: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10202 1727204072.86709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.86713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10202 1727204072.86715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.86730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.86743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.86869: variable 'ansible_python' from source: facts 10202 1727204072.86970: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10202 1727204072.86992: variable '__network_wpa_supplicant_required' from source: role '' defaults 10202 1727204072.87084: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10202 1727204072.87300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.87304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.87307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.87361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 
1727204072.87372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.87431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204072.87458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204072.87570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.87574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204072.87578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204072.87749: variable 'network_connections' from source: task vars 10202 1727204072.87763: variable 'port2_profile' from source: play vars 10202 1727204072.87955: variable 'port2_profile' from source: play vars 10202 1727204072.87958: variable 'port1_profile' from source: play vars 10202 1727204072.87989: variable 'port1_profile' from source: play vars 10202 1727204072.88001: variable 'controller_profile' from source: play vars 10202 1727204072.88574: variable 'controller_profile' from source: play vars 10202 1727204072.88578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 
1727204072.88581: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204072.88583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204072.88609: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204072.88662: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204072.88752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204072.88789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204072.88837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204072.88883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204072.88958: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204072.89312: variable 'network_connections' from source: task vars 10202 1727204072.89325: variable 'port2_profile' from source: play vars 10202 1727204072.89421: variable 'port2_profile' from source: play vars 10202 1727204072.89439: variable 'port1_profile' from source: play vars 10202 1727204072.89541: variable 'port1_profile' from source: play vars 10202 1727204072.89568: variable 'controller_profile' from source: play vars 10202 1727204072.89673: variable 'controller_profile' from source: play vars 10202 
1727204072.89699: variable '__network_packages_default_wireless' from source: role '' defaults 10202 1727204072.89800: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204072.90571: variable 'network_connections' from source: task vars 10202 1727204072.90575: variable 'port2_profile' from source: play vars 10202 1727204072.90719: variable 'port2_profile' from source: play vars 10202 1727204072.90735: variable 'port1_profile' from source: play vars 10202 1727204072.90905: variable 'port1_profile' from source: play vars 10202 1727204072.90936: variable 'controller_profile' from source: play vars 10202 1727204072.91145: variable 'controller_profile' from source: play vars 10202 1727204072.91244: variable '__network_packages_default_team' from source: role '' defaults 10202 1727204072.91374: variable '__network_team_connections_defined' from source: role '' defaults 10202 1727204072.92200: variable 'network_connections' from source: task vars 10202 1727204072.92212: variable 'port2_profile' from source: play vars 10202 1727204072.92304: variable 'port2_profile' from source: play vars 10202 1727204072.92343: variable 'port1_profile' from source: play vars 10202 1727204072.92455: variable 'port1_profile' from source: play vars 10202 1727204072.92458: variable 'controller_profile' from source: play vars 10202 1727204072.92533: variable 'controller_profile' from source: play vars 10202 1727204072.92609: variable '__network_service_name_default_initscripts' from source: role '' defaults 10202 1727204072.92688: variable '__network_service_name_default_initscripts' from source: role '' defaults 10202 1727204072.92727: variable '__network_packages_default_initscripts' from source: role '' defaults 10202 1727204072.92775: variable '__network_packages_default_initscripts' from source: role '' defaults 10202 1727204072.93045: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10202 1727204072.93726: 
variable 'network_connections' from source: task vars 10202 1727204072.93750: variable 'port2_profile' from source: play vars 10202 1727204072.93859: variable 'port2_profile' from source: play vars 10202 1727204072.93864: variable 'port1_profile' from source: play vars 10202 1727204072.93926: variable 'port1_profile' from source: play vars 10202 1727204072.93930: variable 'controller_profile' from source: play vars 10202 1727204072.94000: variable 'controller_profile' from source: play vars 10202 1727204072.94035: variable 'ansible_distribution' from source: facts 10202 1727204072.94038: variable '__network_rh_distros' from source: role '' defaults 10202 1727204072.94041: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.94057: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10202 1727204072.94291: variable 'ansible_distribution' from source: facts 10202 1727204072.94295: variable '__network_rh_distros' from source: role '' defaults 10202 1727204072.94305: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.94470: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10202 1727204072.95084: variable 'ansible_distribution' from source: facts 10202 1727204072.95087: variable '__network_rh_distros' from source: role '' defaults 10202 1727204072.95090: variable 'ansible_distribution_major_version' from source: facts 10202 1727204072.95092: variable 'network_provider' from source: set_fact 10202 1727204072.95173: variable 'omit' from source: magic vars 10202 1727204072.95177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204072.95180: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204072.95392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 
1727204072.95396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204072.95398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204072.95400: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204072.95403: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204072.95477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204072.95711: Set connection var ansible_shell_type to sh 10202 1727204072.95717: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204072.95725: Set connection var ansible_connection to ssh 10202 1727204072.95730: Set connection var ansible_shell_executable to /bin/sh 10202 1727204072.95791: Set connection var ansible_pipelining to False 10202 1727204072.95798: Set connection var ansible_timeout to 10 10202 1727204072.95832: variable 'ansible_shell_executable' from source: unknown 10202 1727204072.95835: variable 'ansible_connection' from source: unknown 10202 1727204072.95838: variable 'ansible_module_compression' from source: unknown 10202 1727204072.95841: variable 'ansible_shell_type' from source: unknown 10202 1727204072.95843: variable 'ansible_shell_executable' from source: unknown 10202 1727204072.95926: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204072.95932: variable 'ansible_pipelining' from source: unknown 10202 1727204072.95936: variable 'ansible_timeout' from source: unknown 10202 1727204072.95938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204072.96390: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204072.96394: variable 'omit' from source: magic vars 10202 1727204072.96396: starting attempt loop 10202 1727204072.96398: running the handler 10202 1727204072.96400: variable 'ansible_facts' from source: unknown 10202 1727204072.97502: _low_level_execute_command(): starting 10202 1727204072.97506: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204072.98393: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204072.98399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204072.98511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204072.98515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204072.98596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204073.00475: stdout 
chunk (state=3): >>>/root <<< 10202 1727204073.00712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204073.00763: stderr chunk (state=3): >>><<< 10202 1727204073.00825: stdout chunk (state=3): >>><<< 10202 1727204073.00959: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204073.00963: _low_level_execute_command(): starting 10202 1727204073.00969: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852 `" && echo ansible-tmp-1727204073.0086308-12107-228717008707852="` echo /root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852 `" ) && sleep 0' 10202 1727204073.01800: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204073.01825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204073.01861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204073.01948: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204073.02000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204073.02052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204073.02077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204073.02185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204073.04501: stdout chunk (state=3): >>>ansible-tmp-1727204073.0086308-12107-228717008707852=/root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852 <<< 10202 1727204073.04621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204073.04769: stderr chunk (state=3): >>><<< 10202 1727204073.04773: stdout chunk (state=3): >>><<< 10202 1727204073.04790: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204073.0086308-12107-228717008707852=/root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204073.05080: variable 'ansible_module_compression' from source: unknown 10202 1727204073.05084: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10202 1727204073.05086: variable 'ansible_facts' from source: unknown 10202 1727204073.05643: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/AnsiballZ_systemd.py 10202 1727204073.05837: Sending initial data 10202 1727204073.05852: Sent initial data (156 bytes) 10202 1727204073.06602: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204073.06680: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204073.06740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204073.06793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204073.06871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204073.08695: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204073.08888: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10202 1727204073.08893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/AnsiballZ_systemd.py" <<< 10202 1727204073.08936: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp0mem6s3h /root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/AnsiballZ_systemd.py <<< 10202 1727204073.08952: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp0mem6s3h" to remote "/root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/AnsiballZ_systemd.py" <<< 10202 1727204073.11287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204073.11532: stderr chunk (state=3): >>><<< 10202 1727204073.11536: stdout chunk (state=3): >>><<< 10202 1727204073.11538: done transferring module to remote 10202 1727204073.11540: _low_level_execute_command(): starting 10202 1727204073.11543: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/ /root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/AnsiballZ_systemd.py && sleep 0' 10202 1727204073.12112: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204073.12133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204073.12188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204073.12192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204073.12283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204073.14327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204073.14414: stderr chunk (state=3): >>><<< 10202 1727204073.14418: stdout chunk (state=3): >>><<< 10202 1727204073.14421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204073.14424: _low_level_execute_command(): starting 10202 1727204073.14427: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/AnsiballZ_systemd.py && sleep 0' 10202 1727204073.15117: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204073.15123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204073.15131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204073.15148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204073.15219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' <<< 10202 1727204073.15232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204073.15235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204073.15305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204073.49127: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ 
path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11759616", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3512942592", "CPUUsageNSec": "939453000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", 
"IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCOR<<< 10202 1727204073.49160: stdout chunk (state=3): >>>E": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.service network.target multi-user.target cloud-init.service", "After": "system.slice cloud-init-local.service basic.target dbus-broker.service network-pre.target systemd-journald.socket dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:23 EDT", "StateChangeTimestampMonotonic": "340960243", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", 
"AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10202 1727204073.51692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204073.51696: stdout chunk (state=3): >>><<< 10202 1727204073.51698: stderr chunk (state=3): >>><<< 10202 1727204073.51729: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "670", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:15 EDT", 
"ExecMainStartTimestampMonotonic": "32994154", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "670", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3507", "MemoryCurrent": "11759616", "MemoryPeak": "13684736", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3512942592", "CPUUsageNSec": "939453000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", 
"StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.service network.target multi-user.target cloud-init.service", "After": "system.slice cloud-init-local.service basic.target dbus-broker.service network-pre.target systemd-journald.socket dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:23 EDT", "StateChangeTimestampMonotonic": "340960243", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:15 EDT", "InactiveExitTimestampMonotonic": "32994691", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:17 EDT", "ActiveEnterTimestampMonotonic": "34735054", 
"ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:15 EDT", "ConditionTimestampMonotonic": "32982961", "AssertTimestamp": "Tue 2024-09-24 14:48:15 EDT", "AssertTimestampMonotonic": "32982965", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4c25d2827e7b45838bcc13e108827a7f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204073.52000: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204073.52003: _low_level_execute_command(): starting 10202 1727204073.52006: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204073.0086308-12107-228717008707852/ > /dev/null 2>&1 && sleep 0' 10202 1727204073.52868: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204073.52889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204073.52901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204073.52917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204073.52939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204073.53039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204073.55273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204073.55277: stdout chunk (state=3): >>><<< 10202 1727204073.55279: stderr chunk (state=3): >>><<< 10202 1727204073.55282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204073.55285: handler run complete 10202 1727204073.55335: attempt loop complete, returning result 10202 1727204073.55338: _execute() done 10202 1727204073.55341: dumping result to json 10202 1727204073.55369: done dumping result, returning 10202 1727204073.55420: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-0b04-2570-000000000088] 10202 1727204073.55423: sending task result for task 127b8e07-fff9-0b04-2570-000000000088 10202 1727204073.56371: done sending task result for task 127b8e07-fff9-0b04-2570-000000000088 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204073.56421: WORKER PROCESS EXITING 10202 1727204073.56443: no more pending results, returning what we have 10202 1727204073.56446: results queue empty 10202 1727204073.56447: checking for any_errors_fatal 10202 1727204073.56453: done checking for any_errors_fatal 10202 1727204073.56454: checking for max_fail_percentage 10202 1727204073.56455: done checking for max_fail_percentage 10202 1727204073.56456: checking to see if all hosts have failed and the running result is not ok 10202 1727204073.56462: done checking to see if all hosts have failed 10202 1727204073.56463: getting the remaining hosts for this loop 10202 1727204073.56464: done getting the remaining hosts for this 
loop 10202 1727204073.56475: getting the next task for host managed-node3 10202 1727204073.56482: done getting next task for host managed-node3 10202 1727204073.56486: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10202 1727204073.56490: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204073.56503: getting variables 10202 1727204073.56505: in VariableManager get_vars() 10202 1727204073.56546: Calling all_inventory to load vars for managed-node3 10202 1727204073.56549: Calling groups_inventory to load vars for managed-node3 10202 1727204073.56552: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204073.56563: Calling all_plugins_play to load vars for managed-node3 10202 1727204073.56573: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204073.56582: Calling groups_plugins_play to load vars for managed-node3 10202 1727204073.58520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204073.61096: done with get_vars() 10202 1727204073.61131: done getting variables 10202 1727204073.61227: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:33 -0400 (0:00:00.841) 0:00:35.288 ***** 10202 1727204073.61281: entering _queue_task() for managed-node3/service 10202 1727204073.61895: worker is 1 (out of 1 available) 10202 1727204073.61916: exiting _queue_task() for managed-node3/service 10202 1727204073.61933: done queuing things up, now waiting for results queue to drain 10202 1727204073.61934: waiting for pending results... 
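[Annotation, not part of the captured log] The "Enable and start NetworkManager" result recorded above was produced by a `systemd` task inside the `fedora.linux_system_roles.network` role. The role's actual task source is not shown in this log, so the following is only a sketch reconstructed from the module invocation printed above (`name: NetworkManager, state: started, enabled: true`), not the role's verbatim YAML:

```yaml
# Illustrative sketch only -- arguments taken from the invocation dict
# logged above; the real task lives in roles/network/tasks/main.yml.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # matches the "output has been hidden" censoring in the result
```

Note that `no_log: true` explains why the task result is rendered as `"censored": "the output has been hidden..."` even though the full module output still appears in the verbose worker stream.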
10202 1727204073.62292: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10202 1727204073.62440: in run() - task 127b8e07-fff9-0b04-2570-000000000089 10202 1727204073.62497: variable 'ansible_search_path' from source: unknown 10202 1727204073.62505: variable 'ansible_search_path' from source: unknown 10202 1727204073.62552: calling self._execute() 10202 1727204073.62739: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204073.62752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204073.62771: variable 'omit' from source: magic vars 10202 1727204073.63380: variable 'ansible_distribution_major_version' from source: facts 10202 1727204073.63423: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204073.63778: variable 'network_provider' from source: set_fact 10202 1727204073.63782: Evaluated conditional (network_provider == "nm"): True 10202 1727204073.63784: variable '__network_wpa_supplicant_required' from source: role '' defaults 10202 1727204073.63869: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10202 1727204073.64106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204073.66708: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204073.66793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204073.66841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204073.66890: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204073.66924: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204073.67033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204073.67074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204073.67107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204073.67155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204073.67180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204073.67236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204073.67260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204073.67287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204073.67328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204073.67345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204073.67389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204073.67413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204073.67440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204073.67680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204073.67684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204073.67768: variable 'network_connections' from source: task vars 10202 1727204073.67789: variable 'port2_profile' from source: play vars 10202 1727204073.67874: variable 'port2_profile' from source: play vars 10202 1727204073.67893: variable 'port1_profile' from source: play vars 10202 1727204073.67967: variable 'port1_profile' from source: play vars 10202 1727204073.67982: variable 'controller_profile' from source: play vars 10202 1727204073.68103: variable 'controller_profile' from source: play vars 10202 
1727204073.68191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1727204073.68389: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1727204073.68435: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1727204073.68475: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1727204073.68511: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1727204073.68570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1727204073.68601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1727204073.68633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204073.68668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1727204073.68728: variable '__network_wireless_connections_defined' from source: role '' defaults 10202 1727204073.69056: variable 'network_connections' from source: task vars 10202 1727204073.69071: variable 'port2_profile' from source: play vars 10202 1727204073.69141: variable 'port2_profile' from source: play vars 10202 1727204073.69155: variable 'port1_profile' from source: play vars 10202 1727204073.69219: variable 'port1_profile' from source: play vars 10202 1727204073.69470: variable 
'controller_profile' from source: play vars 10202 1727204073.69474: variable 'controller_profile' from source: play vars 10202 1727204073.69477: Evaluated conditional (__network_wpa_supplicant_required): False 10202 1727204073.69479: when evaluation is False, skipping this task 10202 1727204073.69481: _execute() done 10202 1727204073.69483: dumping result to json 10202 1727204073.69485: done dumping result, returning 10202 1727204073.69487: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-0b04-2570-000000000089] 10202 1727204073.69489: sending task result for task 127b8e07-fff9-0b04-2570-000000000089 10202 1727204073.69572: done sending task result for task 127b8e07-fff9-0b04-2570-000000000089 10202 1727204073.69576: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10202 1727204073.69720: no more pending results, returning what we have 10202 1727204073.69725: results queue empty 10202 1727204073.69726: checking for any_errors_fatal 10202 1727204073.69754: done checking for any_errors_fatal 10202 1727204073.69756: checking for max_fail_percentage 10202 1727204073.69757: done checking for max_fail_percentage 10202 1727204073.69758: checking to see if all hosts have failed and the running result is not ok 10202 1727204073.69759: done checking to see if all hosts have failed 10202 1727204073.69760: getting the remaining hosts for this loop 10202 1727204073.69762: done getting the remaining hosts for this loop 10202 1727204073.69768: getting the next task for host managed-node3 10202 1727204073.69775: done getting next task for host managed-node3 10202 1727204073.69780: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10202 1727204073.69787: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10202 1727204073.69822: getting variables 10202 1727204073.69824: in VariableManager get_vars() 10202 1727204073.70385: Calling all_inventory to load vars for managed-node3 10202 1727204073.70390: Calling groups_inventory to load vars for managed-node3 10202 1727204073.70393: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204073.70405: Calling all_plugins_play to load vars for managed-node3 10202 1727204073.70464: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204073.70473: Calling groups_plugins_play to load vars for managed-node3 10202 1727204073.73703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204073.76629: done with get_vars() 10202 1727204073.76671: done getting variables 10202 1727204073.76745: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** 
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:33 -0400 (0:00:00.155) 0:00:35.443 ***** 10202 1727204073.76790: entering _queue_task() for managed-node3/service 10202 1727204073.77206: worker is 1 (out of 1 available) 10202 1727204073.77336: exiting _queue_task() for managed-node3/service 10202 1727204073.77355: done queuing things up, now waiting for results queue to drain 10202 1727204073.77356: waiting for pending results... 10202 1727204073.77626: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 10202 1727204073.78104: in run() - task 127b8e07-fff9-0b04-2570-00000000008a 10202 1727204073.78110: variable 'ansible_search_path' from source: unknown 10202 1727204073.78113: variable 'ansible_search_path' from source: unknown 10202 1727204073.78227: calling self._execute() 10202 1727204073.78456: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204073.78566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204073.78571: variable 'omit' from source: magic vars 10202 1727204073.79821: variable 'ansible_distribution_major_version' from source: facts 10202 1727204073.79825: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204073.80054: variable 'network_provider' from source: set_fact 10202 1727204073.80478: Evaluated conditional (network_provider == "initscripts"): False 10202 1727204073.80483: when evaluation is False, skipping this task 10202 1727204073.80485: _execute() done 10202 1727204073.80488: dumping result to json 10202 1727204073.80491: done dumping result, returning 10202 1727204073.80495: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-0b04-2570-00000000008a] 10202 1727204073.80497: sending task result for task 
127b8e07-fff9-0b04-2570-00000000008a skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10202 1727204073.80632: no more pending results, returning what we have 10202 1727204073.80635: results queue empty 10202 1727204073.80636: checking for any_errors_fatal 10202 1727204073.80646: done checking for any_errors_fatal 10202 1727204073.80647: checking for max_fail_percentage 10202 1727204073.80649: done checking for max_fail_percentage 10202 1727204073.80650: checking to see if all hosts have failed and the running result is not ok 10202 1727204073.80651: done checking to see if all hosts have failed 10202 1727204073.80652: getting the remaining hosts for this loop 10202 1727204073.80654: done getting the remaining hosts for this loop 10202 1727204073.80658: getting the next task for host managed-node3 10202 1727204073.80668: done getting next task for host managed-node3 10202 1727204073.80672: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10202 1727204073.80677: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204073.80701: getting variables 10202 1727204073.80703: in VariableManager get_vars() 10202 1727204073.80749: Calling all_inventory to load vars for managed-node3 10202 1727204073.80752: Calling groups_inventory to load vars for managed-node3 10202 1727204073.80754: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204073.80854: Calling all_plugins_play to load vars for managed-node3 10202 1727204073.80859: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204073.80868: done sending task result for task 127b8e07-fff9-0b04-2570-00000000008a 10202 1727204073.80871: WORKER PROCESS EXITING 10202 1727204073.80875: Calling groups_plugins_play to load vars for managed-node3 10202 1727204073.84083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204073.86476: done with get_vars() 10202 1727204073.86517: done getting variables 10202 1727204073.86616: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:33 -0400 (0:00:00.098) 0:00:35.541 ***** 10202 1727204073.86656: entering _queue_task() for managed-node3/copy 10202 1727204073.87272: worker is 1 (out of 1 available) 10202 1727204073.87284: exiting _queue_task() for managed-node3/copy 10202 1727204073.87296: done queuing things up, now waiting for results queue to drain 10202 1727204073.87298: waiting for pending results... 
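[Annotation, not part of the captured log] The two skipped tasks above ("Enable and start wpa_supplicant", "Enable network service") illustrate the role's provider gating: each task carries `when:` conditions, and the log's "Evaluated conditional (...)" lines show which one failed. The condition expressions below are copied from those log lines; the task bodies themselves are assumptions, since the role source is not included here:

```yaml
# Illustrative sketch only -- conditions taken from the
# "Evaluated conditional (...)" entries above; module arguments assumed.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'   # True here
    - network_provider == "nm"                    # True here
    - __network_wpa_supplicant_required           # False -> task skipped

- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"         # False under the nm provider
```

Because `network_provider` was set to `nm` via `set_fact`, every initscripts-only task in this run is skipped with `skip_reason: Conditional result was False`.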
10202 1727204073.87581: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10202 1727204073.87894: in run() - task 127b8e07-fff9-0b04-2570-00000000008b 10202 1727204073.88072: variable 'ansible_search_path' from source: unknown 10202 1727204073.88078: variable 'ansible_search_path' from source: unknown 10202 1727204073.88081: calling self._execute() 10202 1727204073.88327: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204073.88402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204073.88407: variable 'omit' from source: magic vars 10202 1727204073.89394: variable 'ansible_distribution_major_version' from source: facts 10202 1727204073.89399: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204073.89543: variable 'network_provider' from source: set_fact 10202 1727204073.89556: Evaluated conditional (network_provider == "initscripts"): False 10202 1727204073.89611: when evaluation is False, skipping this task 10202 1727204073.89615: _execute() done 10202 1727204073.89623: dumping result to json 10202 1727204073.89629: done dumping result, returning 10202 1727204073.89633: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-0b04-2570-00000000008b] 10202 1727204073.89635: sending task result for task 127b8e07-fff9-0b04-2570-00000000008b skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10202 1727204073.89993: no more pending results, returning what we have 10202 1727204073.89997: results queue empty 10202 1727204073.89999: checking for any_errors_fatal 10202 1727204073.90007: done checking for any_errors_fatal 10202 1727204073.90008: checking for max_fail_percentage 10202 
1727204073.90011: done checking for max_fail_percentage 10202 1727204073.90012: checking to see if all hosts have failed and the running result is not ok 10202 1727204073.90013: done checking to see if all hosts have failed 10202 1727204073.90014: getting the remaining hosts for this loop 10202 1727204073.90015: done getting the remaining hosts for this loop 10202 1727204073.90019: getting the next task for host managed-node3 10202 1727204073.90027: done getting next task for host managed-node3 10202 1727204073.90031: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10202 1727204073.90036: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204073.90058: getting variables 10202 1727204073.90060: in VariableManager get_vars() 10202 1727204073.90108: Calling all_inventory to load vars for managed-node3 10202 1727204073.90111: Calling groups_inventory to load vars for managed-node3 10202 1727204073.90114: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204073.90126: Calling all_plugins_play to load vars for managed-node3 10202 1727204073.90129: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204073.90132: Calling groups_plugins_play to load vars for managed-node3 10202 1727204073.90731: done sending task result for task 127b8e07-fff9-0b04-2570-00000000008b 10202 1727204073.90736: WORKER PROCESS EXITING 10202 1727204073.92326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204073.95600: done with get_vars() 10202 1727204073.95632: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:33 -0400 (0:00:00.090) 0:00:35.632 ***** 10202 1727204073.95740: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 10202 1727204073.96155: worker is 1 (out of 1 available) 10202 1727204073.96315: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 10202 1727204073.96328: done queuing things up, now waiting for results queue to drain 10202 1727204073.96330: waiting for pending results... 
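[Annotation, not part of the captured log] The "Configure networking connection profiles" task resolves a `network_connections` list built from the play vars `controller_profile`, `port1_profile`, and `port2_profile` seen in the variable-resolution lines above. Their actual values are not shown in this log, so the play sketch below is an assumption (a controller with two port profiles, to judge by the names), included only to show the shape of input the role consumes:

```yaml
# Illustrative sketch only -- profile contents are NOT in the log;
# types (bond/ethernet) and states below are assumed for illustration.
- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: "{{ controller_profile }}"
            type: bond
            state: up
          - name: "{{ port1_profile }}"
            type: ethernet
            controller: "{{ controller_profile }}"
            state: up
          - name: "{{ port2_profile }}"
            type: ethernet
            controller: "{{ controller_profile }}"
            state: up
```

The `get_ansible_managed.j2` template lookup that follows in the log supplies the `ansible_managed` banner written into the generated profile files.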
10202 1727204073.96519: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10202 1727204073.96735: in run() - task 127b8e07-fff9-0b04-2570-00000000008c 10202 1727204073.96888: variable 'ansible_search_path' from source: unknown 10202 1727204073.96941: variable 'ansible_search_path' from source: unknown 10202 1727204073.97000: calling self._execute() 10202 1727204073.97125: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204073.97143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204073.97185: variable 'omit' from source: magic vars 10202 1727204073.97646: variable 'ansible_distribution_major_version' from source: facts 10202 1727204073.97663: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204073.97730: variable 'omit' from source: magic vars 10202 1727204073.97774: variable 'omit' from source: magic vars 10202 1727204073.97984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10202 1727204074.00631: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10202 1727204074.00723: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10202 1727204074.00786: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10202 1727204074.00823: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10202 1727204074.00883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10202 1727204074.00960: variable 'network_provider' from source: set_fact 10202 1727204074.01138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10202 1727204074.01193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10202 1727204074.01316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10202 1727204074.01320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10202 1727204074.01325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10202 1727204074.01400: variable 'omit' from source: magic vars 10202 1727204074.01537: variable 'omit' from source: magic vars 10202 1727204074.01675: variable 'network_connections' from source: task vars 10202 1727204074.01691: variable 'port2_profile' from source: play vars 10202 1727204074.01753: variable 'port2_profile' from source: play vars 10202 1727204074.01780: variable 'port1_profile' from source: play vars 10202 1727204074.01875: variable 'port1_profile' from source: play vars 10202 1727204074.01878: variable 'controller_profile' from source: play vars 10202 1727204074.01924: variable 'controller_profile' from source: play vars 10202 1727204074.02112: variable 'omit' from source: magic vars 10202 1727204074.02125: variable '__lsr_ansible_managed' from source: task vars 10202 1727204074.02203: variable '__lsr_ansible_managed' from source: task vars 10202 1727204074.02407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10202 
1727204074.02748: Loaded config def from plugin (lookup/template) 10202 1727204074.02751: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10202 1727204074.02753: File lookup term: get_ansible_managed.j2 10202 1727204074.02755: variable 'ansible_search_path' from source: unknown 10202 1727204074.02758: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10202 1727204074.02762: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10202 1727204074.02783: variable 'ansible_search_path' from source: unknown 10202 1727204074.13364: variable 'ansible_managed' from source: unknown 10202 1727204074.13570: variable 'omit' from source: magic vars 10202 1727204074.13669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204074.13679: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204074.13683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204074.13695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204074.13709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204074.13742: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204074.13749: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204074.13757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204074.13871: Set connection var ansible_shell_type to sh 10202 1727204074.13888: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204074.13904: Set connection var ansible_connection to ssh 10202 1727204074.13914: Set connection var ansible_shell_executable to /bin/sh 10202 1727204074.13923: Set connection var ansible_pipelining to False 10202 1727204074.13932: Set connection var ansible_timeout to 10 10202 1727204074.13971: variable 'ansible_shell_executable' from source: unknown 10202 1727204074.13974: variable 'ansible_connection' from source: unknown 10202 1727204074.13989: variable 'ansible_module_compression' from source: unknown 10202 1727204074.13992: variable 'ansible_shell_type' from source: unknown 10202 1727204074.13994: variable 'ansible_shell_executable' from source: unknown 10202 1727204074.14071: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204074.14074: variable 'ansible_pipelining' from source: unknown 10202 1727204074.14077: variable 'ansible_timeout' from source: unknown 10202 1727204074.14087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 
1727204074.14197: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204074.14228: variable 'omit' from source: magic vars 10202 1727204074.14242: starting attempt loop 10202 1727204074.14250: running the handler 10202 1727204074.14273: _low_level_execute_command(): starting 10202 1727204074.14284: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204074.15100: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204074.15143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10202 1727204074.15155: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204074.15213: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204074.15271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204074.15289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204074.15547: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204074.15644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204074.17514: stdout chunk (state=3): >>>/root <<< 10202 1727204074.17646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204074.17800: stderr chunk (state=3): >>><<< 10202 1727204074.17804: stdout chunk (state=3): >>><<< 10202 1727204074.17983: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204074.17989: _low_level_execute_command(): starting 10202 1727204074.17992: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853 `" && echo 
ansible-tmp-1727204074.1788256-12157-111965331949853="` echo /root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853 `" ) && sleep 0' 10202 1727204074.19310: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204074.19390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204074.19399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204074.19411: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204074.19421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204074.19424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204074.19442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204074.19449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204074.19521: stderr chunk (state=3): >>>debug2: match found <<< 10202 1727204074.19545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204074.19700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204074.19791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204074.21970: stdout chunk (state=3): 
>>>ansible-tmp-1727204074.1788256-12157-111965331949853=/root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853 <<< 10202 1727204074.22087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204074.22150: stderr chunk (state=3): >>><<< 10202 1727204074.22154: stdout chunk (state=3): >>><<< 10202 1727204074.22168: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204074.1788256-12157-111965331949853=/root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204074.22211: variable 'ansible_module_compression' from source: unknown 10202 1727204074.22256: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 10202 1727204074.22305: variable 'ansible_facts' from source: unknown 10202 1727204074.22398: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/AnsiballZ_network_connections.py 10202 1727204074.22512: Sending initial data 10202 1727204074.22515: Sent initial data (168 bytes) 10202 1727204074.23130: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204074.23135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204074.23217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204074.23221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204074.23299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204074.25070: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204074.25130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10202 1727204074.25216: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpb3ckxn4g /root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/AnsiballZ_network_connections.py <<< 10202 1727204074.25219: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/AnsiballZ_network_connections.py" <<< 10202 1727204074.25278: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpb3ckxn4g" to remote "/root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/AnsiballZ_network_connections.py" <<< 10202 1727204074.26692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204074.26786: stderr chunk (state=3): >>><<< 10202 1727204074.26839: stdout chunk (state=3): >>><<< 10202 1727204074.26884: done transferring module to remote 10202 1727204074.26964: _low_level_execute_command(): starting 
10202 1727204074.26975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/ /root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/AnsiballZ_network_connections.py && sleep 0' 10202 1727204074.27623: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204074.27628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204074.27693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204074.27702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204074.27788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204074.30089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204074.30094: stderr chunk (state=3): >>><<< 10202 1727204074.30114: stdout chunk (state=3): >>><<< 10202 1727204074.30140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204074.30147: _low_level_execute_command(): starting 10202 1727204074.30154: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/AnsiballZ_network_connections.py && sleep 0' 10202 1727204074.30867: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204074.30957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204074.31011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204074.31113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204075.04863: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/c296b684-3111-41df-a255-0a2b3f77cf01: error=unknown <<< 10202 1727204075.12388: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 10202 1727204075.12460: stdout chunk (state=3): >>> File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/e3635d21-1c21-4d1f-ab3f-feb9091d21d0: error=unknown <<< 10202 1727204075.14842: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/f32dee73-d17d-466b-80a4-4a2bab216d3b: error=unknown <<< 10202 1727204075.15078: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 10202 1727204075.17772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204075.17776: stdout chunk (state=3): >>><<< 10202 1727204075.17779: stderr chunk (state=3): >>><<< 10202 1727204075.17782: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/c296b684-3111-41df-a255-0a2b3f77cf01: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/e3635d21-1c21-4d1f-ab3f-feb9091d21d0: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_13b8kzm6/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/f32dee73-d17d-466b-80a4-4a2bab216d3b: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
10202 1727204075.17790: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204075.17793: _low_level_execute_command(): starting 10202 1727204075.17795: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204074.1788256-12157-111965331949853/ > /dev/null 2>&1 && sleep 0' 10202 1727204075.18570: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204075.18576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204075.18587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204075.18601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204075.18612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204075.18619: stderr chunk (state=3): >>>debug2: 
match not found <<< 10202 1727204075.18632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204075.18647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204075.18654: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204075.18661: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204075.18835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204075.18839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204075.18842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204075.18844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204075.18846: stderr chunk (state=3): >>>debug2: match found <<< 10202 1727204075.18848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204075.18851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204075.18853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204075.19083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204075.21287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204075.21293: stdout chunk (state=3): >>><<< 10202 1727204075.21297: stderr chunk (state=3): >>><<< 10202 1727204075.21373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204075.21377: handler run complete 10202 1727204075.21380: attempt loop complete, returning result 10202 1727204075.21382: _execute() done 10202 1727204075.21384: dumping result to json 10202 1727204075.21390: done dumping result, returning 10202 1727204075.21393: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-0b04-2570-00000000008c] 10202 1727204075.21395: sending task result for task 127b8e07-fff9-0b04-2570-00000000008c 10202 1727204075.21748: done sending task result for task 127b8e07-fff9-0b04-2570-00000000008c 10202 1727204075.21754: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": 
"bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 10202 1727204075.21884: no more pending results, returning what we have 10202 1727204075.21887: results queue empty 10202 1727204075.21889: checking for any_errors_fatal 10202 1727204075.21896: done checking for any_errors_fatal 10202 1727204075.21897: checking for max_fail_percentage 10202 1727204075.21899: done checking for max_fail_percentage 10202 1727204075.21900: checking to see if all hosts have failed and the running result is not ok 10202 1727204075.21901: done checking to see if all hosts have failed 10202 1727204075.21901: getting the remaining hosts for this loop 10202 1727204075.21903: done getting the remaining hosts for this loop 10202 1727204075.21908: getting the next task for host managed-node3 10202 1727204075.21915: done getting next task for host managed-node3 10202 1727204075.21919: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10202 1727204075.21923: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204075.21940: getting variables 10202 1727204075.21942: in VariableManager get_vars() 10202 1727204075.22444: Calling all_inventory to load vars for managed-node3 10202 1727204075.22448: Calling groups_inventory to load vars for managed-node3 10202 1727204075.22451: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204075.22462: Calling all_plugins_play to load vars for managed-node3 10202 1727204075.22467: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204075.22471: Calling groups_plugins_play to load vars for managed-node3 10202 1727204075.26732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204075.32307: done with get_vars() 10202 1727204075.32352: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:35 -0400 (0:00:01.368) 0:00:37.000 ***** 10202 1727204075.32550: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 10202 1727204075.33508: worker is 1 (out of 1 available) 10202 1727204075.33526: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 10202 1727204075.33544: done queuing things up, now waiting for results queue to drain 10202 1727204075.33545: waiting for pending results... 
10202 1727204075.34041: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 10202 1727204075.34268: in run() - task 127b8e07-fff9-0b04-2570-00000000008d 10202 1727204075.34276: variable 'ansible_search_path' from source: unknown 10202 1727204075.34280: variable 'ansible_search_path' from source: unknown 10202 1727204075.34463: calling self._execute() 10202 1727204075.34586: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.34590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204075.34673: variable 'omit' from source: magic vars 10202 1727204075.35821: variable 'ansible_distribution_major_version' from source: facts 10202 1727204075.35876: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204075.36132: variable 'network_state' from source: role '' defaults 10202 1727204075.36773: Evaluated conditional (network_state != {}): False 10202 1727204075.36776: when evaluation is False, skipping this task 10202 1727204075.36779: _execute() done 10202 1727204075.36782: dumping result to json 10202 1727204075.36784: done dumping result, returning 10202 1727204075.36787: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-0b04-2570-00000000008d] 10202 1727204075.36790: sending task result for task 127b8e07-fff9-0b04-2570-00000000008d 10202 1727204075.36868: done sending task result for task 127b8e07-fff9-0b04-2570-00000000008d 10202 1727204075.36871: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10202 1727204075.36917: no more pending results, returning what we have 10202 1727204075.36920: results queue empty 10202 1727204075.36921: checking for any_errors_fatal 10202 1727204075.36932: done checking for any_errors_fatal 
10202 1727204075.36933: checking for max_fail_percentage 10202 1727204075.36935: done checking for max_fail_percentage 10202 1727204075.36936: checking to see if all hosts have failed and the running result is not ok 10202 1727204075.36937: done checking to see if all hosts have failed 10202 1727204075.36938: getting the remaining hosts for this loop 10202 1727204075.36939: done getting the remaining hosts for this loop 10202 1727204075.36943: getting the next task for host managed-node3 10202 1727204075.36949: done getting next task for host managed-node3 10202 1727204075.36953: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10202 1727204075.36958: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204075.36979: getting variables 10202 1727204075.36980: in VariableManager get_vars() 10202 1727204075.37020: Calling all_inventory to load vars for managed-node3 10202 1727204075.37023: Calling groups_inventory to load vars for managed-node3 10202 1727204075.37025: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204075.37038: Calling all_plugins_play to load vars for managed-node3 10202 1727204075.37041: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204075.37044: Calling groups_plugins_play to load vars for managed-node3 10202 1727204075.41891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204075.46532: done with get_vars() 10202 1727204075.46626: done getting variables 10202 1727204075.46755: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:35 -0400 (0:00:00.143) 0:00:37.144 ***** 10202 1727204075.46869: entering _queue_task() for managed-node3/debug 10202 1727204075.47754: worker is 1 (out of 1 available) 10202 1727204075.47772: exiting _queue_task() for managed-node3/debug 10202 1727204075.47785: done queuing things up, now waiting for results queue to drain 10202 1727204075.47786: waiting for pending results... 
10202 1727204075.48357: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10202 1727204075.48800: in run() - task 127b8e07-fff9-0b04-2570-00000000008e 10202 1727204075.48805: variable 'ansible_search_path' from source: unknown 10202 1727204075.48808: variable 'ansible_search_path' from source: unknown 10202 1727204075.48965: calling self._execute() 10202 1727204075.49181: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.49195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204075.49199: variable 'omit' from source: magic vars 10202 1727204075.50095: variable 'ansible_distribution_major_version' from source: facts 10202 1727204075.50108: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204075.50124: variable 'omit' from source: magic vars 10202 1727204075.50320: variable 'omit' from source: magic vars 10202 1727204075.50565: variable 'omit' from source: magic vars 10202 1727204075.50595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204075.50636: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204075.50661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204075.50888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204075.51115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204075.51118: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204075.51122: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.51125: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 10202 1727204075.51188: Set connection var ansible_shell_type to sh 10202 1727204075.51283: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204075.51289: Set connection var ansible_connection to ssh 10202 1727204075.51296: Set connection var ansible_shell_executable to /bin/sh 10202 1727204075.51318: Set connection var ansible_pipelining to False 10202 1727204075.51331: Set connection var ansible_timeout to 10 10202 1727204075.51357: variable 'ansible_shell_executable' from source: unknown 10202 1727204075.51360: variable 'ansible_connection' from source: unknown 10202 1727204075.51364: variable 'ansible_module_compression' from source: unknown 10202 1727204075.51371: variable 'ansible_shell_type' from source: unknown 10202 1727204075.51551: variable 'ansible_shell_executable' from source: unknown 10202 1727204075.51554: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.51557: variable 'ansible_pipelining' from source: unknown 10202 1727204075.51559: variable 'ansible_timeout' from source: unknown 10202 1727204075.51561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204075.51880: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204075.51892: variable 'omit' from source: magic vars 10202 1727204075.51898: starting attempt loop 10202 1727204075.51901: running the handler 10202 1727204075.52278: variable '__network_connections_result' from source: set_fact 10202 1727204075.52347: handler run complete 10202 1727204075.52370: attempt loop complete, returning result 10202 1727204075.52374: _execute() done 10202 1727204075.52376: dumping result to json 10202 1727204075.52379: 
done dumping result, returning 10202 1727204075.52506: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-0b04-2570-00000000008e] 10202 1727204075.52509: sending task result for task 127b8e07-fff9-0b04-2570-00000000008e 10202 1727204075.52618: done sending task result for task 127b8e07-fff9-0b04-2570-00000000008e 10202 1727204075.52622: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 10202 1727204075.52699: no more pending results, returning what we have 10202 1727204075.52703: results queue empty 10202 1727204075.52704: checking for any_errors_fatal 10202 1727204075.52717: done checking for any_errors_fatal 10202 1727204075.52718: checking for max_fail_percentage 10202 1727204075.52720: done checking for max_fail_percentage 10202 1727204075.52721: checking to see if all hosts have failed and the running result is not ok 10202 1727204075.52823: done checking to see if all hosts have failed 10202 1727204075.52824: getting the remaining hosts for this loop 10202 1727204075.52826: done getting the remaining hosts for this loop 10202 1727204075.52833: getting the next task for host managed-node3 10202 1727204075.52841: done getting next task for host managed-node3 10202 1727204075.52846: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10202 1727204075.52851: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10202 1727204075.52865: getting variables 10202 1727204075.52869: in VariableManager get_vars() 10202 1727204075.52915: Calling all_inventory to load vars for managed-node3 10202 1727204075.52919: Calling groups_inventory to load vars for managed-node3 10202 1727204075.52922: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204075.52934: Calling all_plugins_play to load vars for managed-node3 10202 1727204075.52937: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204075.52941: Calling groups_plugins_play to load vars for managed-node3 10202 1727204075.57741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204075.65255: done with get_vars() 10202 1727204075.65307: done getting variables 10202 1727204075.65608: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:35 -0400 (0:00:00.187) 0:00:37.331 ***** 10202 1727204075.65653: entering _queue_task() for managed-node3/debug 10202 1727204075.67108: worker is 1 (out of 1 available) 10202 
1727204075.67123: exiting _queue_task() for managed-node3/debug 10202 1727204075.67134: done queuing things up, now waiting for results queue to drain 10202 1727204075.67136: waiting for pending results... 10202 1727204075.67699: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10202 1727204075.67889: in run() - task 127b8e07-fff9-0b04-2570-00000000008f 10202 1727204075.67920: variable 'ansible_search_path' from source: unknown 10202 1727204075.68073: variable 'ansible_search_path' from source: unknown 10202 1727204075.68077: calling self._execute() 10202 1727204075.68338: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.68342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204075.68345: variable 'omit' from source: magic vars 10202 1727204075.69322: variable 'ansible_distribution_major_version' from source: facts 10202 1727204075.69350: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204075.69499: variable 'omit' from source: magic vars 10202 1727204075.69755: variable 'omit' from source: magic vars 10202 1727204075.69758: variable 'omit' from source: magic vars 10202 1727204075.69893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204075.69943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204075.69997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204075.70098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204075.70118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204075.70274: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204075.70278: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.70281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204075.70502: Set connection var ansible_shell_type to sh 10202 1727204075.70576: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204075.70590: Set connection var ansible_connection to ssh 10202 1727204075.70601: Set connection var ansible_shell_executable to /bin/sh 10202 1727204075.70613: Set connection var ansible_pipelining to False 10202 1727204075.70636: Set connection var ansible_timeout to 10 10202 1727204075.70677: variable 'ansible_shell_executable' from source: unknown 10202 1727204075.70743: variable 'ansible_connection' from source: unknown 10202 1727204075.70753: variable 'ansible_module_compression' from source: unknown 10202 1727204075.70760: variable 'ansible_shell_type' from source: unknown 10202 1727204075.70961: variable 'ansible_shell_executable' from source: unknown 10202 1727204075.70967: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.70970: variable 'ansible_pipelining' from source: unknown 10202 1727204075.70973: variable 'ansible_timeout' from source: unknown 10202 1727204075.70975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204075.71375: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204075.71388: variable 'omit' from source: magic vars 10202 1727204075.71396: starting attempt loop 10202 1727204075.71573: running the handler 10202 1727204075.71577: variable '__network_connections_result' from source: set_fact 10202 
1727204075.71748: variable '__network_connections_result' from source: set_fact 10202 1727204075.72068: handler run complete 10202 1727204075.72103: attempt loop complete, returning result 10202 1727204075.72350: _execute() done 10202 1727204075.72353: dumping result to json 10202 1727204075.72356: done dumping result, returning 10202 1727204075.72359: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-0b04-2570-00000000008f] 10202 1727204075.72361: sending task result for task 127b8e07-fff9-0b04-2570-00000000008f 10202 1727204075.72450: done sending task result for task 127b8e07-fff9-0b04-2570-00000000008f 10202 1727204075.72459: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 10202 1727204075.72574: no more pending results, returning what we have 10202 1727204075.72578: results queue empty 10202 1727204075.72580: checking for any_errors_fatal 10202 1727204075.72590: done checking for any_errors_fatal 10202 1727204075.72591: checking for max_fail_percentage 10202 1727204075.72594: done checking for max_fail_percentage 10202 1727204075.72595: checking to see if all hosts have failed and the running result is not ok 10202 1727204075.72596: done checking to see if all hosts have failed 10202 1727204075.72597: getting the remaining hosts for this loop 10202 1727204075.72599: done getting the remaining hosts for this loop 10202 1727204075.72604: 
getting the next task for host managed-node3 10202 1727204075.72611: done getting next task for host managed-node3 10202 1727204075.72616: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10202 1727204075.72620: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204075.72634: getting variables 10202 1727204075.72636: in VariableManager get_vars() 10202 1727204075.72817: Calling all_inventory to load vars for managed-node3 10202 1727204075.72820: Calling groups_inventory to load vars for managed-node3 10202 1727204075.72822: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204075.72834: Calling all_plugins_play to load vars for managed-node3 10202 1727204075.72844: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204075.72847: Calling groups_plugins_play to load vars for managed-node3 10202 1727204075.76996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204075.81359: done with get_vars() 10202 1727204075.81512: done getting variables 10202 1727204075.81697: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:35 -0400 (0:00:00.160) 0:00:37.492 ***** 10202 1727204075.81738: entering _queue_task() for managed-node3/debug 10202 1727204075.82535: worker is 1 (out of 1 available) 10202 1727204075.82549: exiting _queue_task() for managed-node3/debug 10202 1727204075.82565: done queuing things up, now waiting for results queue to drain 10202 1727204075.82568: waiting for pending results... 
10202 1727204075.83218: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10202 1727204075.83898: in run() - task 127b8e07-fff9-0b04-2570-000000000090 10202 1727204075.83902: variable 'ansible_search_path' from source: unknown 10202 1727204075.83906: variable 'ansible_search_path' from source: unknown 10202 1727204075.83909: calling self._execute() 10202 1727204075.83956: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.83961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204075.83972: variable 'omit' from source: magic vars 10202 1727204075.84878: variable 'ansible_distribution_major_version' from source: facts 10202 1727204075.84911: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204075.85172: variable 'network_state' from source: role '' defaults 10202 1727204075.85176: Evaluated conditional (network_state != {}): False 10202 1727204075.85179: when evaluation is False, skipping this task 10202 1727204075.85182: _execute() done 10202 1727204075.85184: dumping result to json 10202 1727204075.85186: done dumping result, returning 10202 1727204075.85189: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-0b04-2570-000000000090] 10202 1727204075.85192: sending task result for task 127b8e07-fff9-0b04-2570-000000000090 10202 1727204075.85541: done sending task result for task 127b8e07-fff9-0b04-2570-000000000090 10202 1727204075.85545: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 10202 1727204075.85602: no more pending results, returning what we have 10202 1727204075.85605: results queue empty 10202 1727204075.85606: checking for any_errors_fatal 10202 1727204075.85613: done checking for any_errors_fatal 10202 1727204075.85614: checking for 
max_fail_percentage 10202 1727204075.85616: done checking for max_fail_percentage 10202 1727204075.85617: checking to see if all hosts have failed and the running result is not ok 10202 1727204075.85618: done checking to see if all hosts have failed 10202 1727204075.85618: getting the remaining hosts for this loop 10202 1727204075.85620: done getting the remaining hosts for this loop 10202 1727204075.85623: getting the next task for host managed-node3 10202 1727204075.85629: done getting next task for host managed-node3 10202 1727204075.85633: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10202 1727204075.85637: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204075.85654: getting variables 10202 1727204075.85656: in VariableManager get_vars() 10202 1727204075.85699: Calling all_inventory to load vars for managed-node3 10202 1727204075.85702: Calling groups_inventory to load vars for managed-node3 10202 1727204075.85705: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204075.85715: Calling all_plugins_play to load vars for managed-node3 10202 1727204075.85718: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204075.85721: Calling groups_plugins_play to load vars for managed-node3 10202 1727204075.89999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204075.94148: done with get_vars() 10202 1727204075.94400: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:35 -0400 (0:00:00.127) 0:00:37.620 ***** 10202 1727204075.94516: entering _queue_task() for managed-node3/ping 10202 1727204075.95707: worker is 1 (out of 1 available) 10202 1727204075.95723: exiting _queue_task() for managed-node3/ping 10202 1727204075.95853: done queuing things up, now waiting for results queue to drain 10202 1727204075.95855: waiting for pending results... 
10202 1727204075.96339: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 10202 1727204075.96609: in run() - task 127b8e07-fff9-0b04-2570-000000000091 10202 1727204075.96642: variable 'ansible_search_path' from source: unknown 10202 1727204075.96646: variable 'ansible_search_path' from source: unknown 10202 1727204075.96669: calling self._execute() 10202 1727204075.97044: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.97049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204075.97052: variable 'omit' from source: magic vars 10202 1727204075.97822: variable 'ansible_distribution_major_version' from source: facts 10202 1727204075.97835: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204075.97848: variable 'omit' from source: magic vars 10202 1727204075.98234: variable 'omit' from source: magic vars 10202 1727204075.98276: variable 'omit' from source: magic vars 10202 1727204075.98442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204075.98583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204075.98605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204075.98770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204075.98777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204075.98792: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204075.98795: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.98798: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 10202 1727204075.99258: Set connection var ansible_shell_type to sh 10202 1727204075.99269: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204075.99387: Set connection var ansible_connection to ssh 10202 1727204075.99473: Set connection var ansible_shell_executable to /bin/sh 10202 1727204075.99476: Set connection var ansible_pipelining to False 10202 1727204075.99478: Set connection var ansible_timeout to 10 10202 1727204075.99480: variable 'ansible_shell_executable' from source: unknown 10202 1727204075.99483: variable 'ansible_connection' from source: unknown 10202 1727204075.99488: variable 'ansible_module_compression' from source: unknown 10202 1727204075.99489: variable 'ansible_shell_type' from source: unknown 10202 1727204075.99491: variable 'ansible_shell_executable' from source: unknown 10202 1727204075.99493: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204075.99495: variable 'ansible_pipelining' from source: unknown 10202 1727204075.99496: variable 'ansible_timeout' from source: unknown 10202 1727204075.99498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204076.00201: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10202 1727204076.00206: variable 'omit' from source: magic vars 10202 1727204076.00209: starting attempt loop 10202 1727204076.00211: running the handler 10202 1727204076.00213: _low_level_execute_command(): starting 10202 1727204076.00216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204076.01954: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.01976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.02226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.04090: stdout chunk (state=3): >>>/root <<< 10202 1727204076.04574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.04579: stdout chunk (state=3): >>><<< 10202 1727204076.04582: stderr chunk (state=3): >>><<< 10202 1727204076.04585: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204076.04588: _low_level_execute_command(): starting 10202 1727204076.04591: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088 `" && echo ansible-tmp-1727204076.0453796-12245-273833791217088="` echo /root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088 `" ) && sleep 0' 10202 1727204076.06055: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.06177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204076.06356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.06373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.06612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.08754: stdout chunk (state=3): >>>ansible-tmp-1727204076.0453796-12245-273833791217088=/root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088 <<< 10202 1727204076.08953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.09004: stdout chunk (state=3): >>><<< 10202 1727204076.09019: stderr chunk (state=3): >>><<< 10202 1727204076.09051: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204076.0453796-12245-273833791217088=/root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204076.09190: variable 'ansible_module_compression' from source: unknown 10202 1727204076.09193: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 10202 1727204076.09688: variable 'ansible_facts' from source: unknown 10202 1727204076.09691: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/AnsiballZ_ping.py 10202 1727204076.10196: Sending initial data 10202 1727204076.10200: Sent initial data (153 bytes) 10202 1727204076.11794: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.12075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204076.12092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 
1727204076.12117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.12343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.14107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 10202 1727204076.14114: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204076.14438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/AnsiballZ_ping.py" <<< 10202 1727204076.14450: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpqry3uns_ /root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/AnsiballZ_ping.py <<< 10202 1727204076.14819: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpqry3uns_" to remote "/root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/AnsiballZ_ping.py" <<< 10202 1727204076.16793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.17074: stderr chunk (state=3): >>><<< 10202 1727204076.17078: stdout chunk (state=3): >>><<< 10202 1727204076.17080: done transferring module to remote 10202 1727204076.17082: _low_level_execute_command(): starting 10202 1727204076.17085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/ /root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/AnsiballZ_ping.py && sleep 0' 10202 1727204076.18558: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.18642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204076.18955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.19091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.19422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.21506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.21511: stdout chunk (state=3): >>><<< 10202 1727204076.21513: stderr chunk (state=3): >>><<< 10202 1727204076.21553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204076.21591: _low_level_execute_command(): starting 10202 1727204076.21647: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/AnsiballZ_ping.py && sleep 0' 10202 1727204076.23160: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204076.23235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.23332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204076.23451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.23564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.23788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.41549: stdout chunk 
(state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10202 1727204076.42954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.42986: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<< 10202 1727204076.43192: stderr chunk (state=3): >>><<< 10202 1727204076.43198: stdout chunk (state=3): >>><<< 10202 1727204076.43376: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
10202 1727204076.43384: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204076.43388: _low_level_execute_command(): starting 10202 1727204076.43392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204076.0453796-12245-273833791217088/ > /dev/null 2>&1 && sleep 0' 10202 1727204076.44557: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204076.44561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.44695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.44788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.44904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.44978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.47606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.47612: stdout chunk (state=3): >>><<< 10202 1727204076.47614: stderr chunk (state=3): >>><<< 10202 1727204076.47616: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204076.47623: handler run complete 10202 1727204076.47625: attempt loop complete, returning 
result 10202 1727204076.47629: _execute() done 10202 1727204076.47632: dumping result to json 10202 1727204076.47634: done dumping result, returning 10202 1727204076.47636: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-0b04-2570-000000000091] 10202 1727204076.47638: sending task result for task 127b8e07-fff9-0b04-2570-000000000091 10202 1727204076.47774: done sending task result for task 127b8e07-fff9-0b04-2570-000000000091 ok: [managed-node3] => { "changed": false, "ping": "pong" } 10202 1727204076.47850: no more pending results, returning what we have 10202 1727204076.47853: results queue empty 10202 1727204076.47855: checking for any_errors_fatal 10202 1727204076.47862: done checking for any_errors_fatal 10202 1727204076.47863: checking for max_fail_percentage 10202 1727204076.47867: done checking for max_fail_percentage 10202 1727204076.47869: checking to see if all hosts have failed and the running result is not ok 10202 1727204076.47870: done checking to see if all hosts have failed 10202 1727204076.47871: getting the remaining hosts for this loop 10202 1727204076.47873: done getting the remaining hosts for this loop 10202 1727204076.47878: getting the next task for host managed-node3 10202 1727204076.47887: done getting next task for host managed-node3 10202 1727204076.47890: ^ task is: TASK: meta (role_complete) 10202 1727204076.47894: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10202 1727204076.47911: getting variables 10202 1727204076.47912: in VariableManager get_vars() 10202 1727204076.47956: Calling all_inventory to load vars for managed-node3 10202 1727204076.47959: Calling groups_inventory to load vars for managed-node3 10202 1727204076.47961: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204076.48102: Calling all_plugins_play to load vars for managed-node3 10202 1727204076.48106: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204076.48111: Calling groups_plugins_play to load vars for managed-node3 10202 1727204076.48643: WORKER PROCESS EXITING 10202 1727204076.50112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204076.53188: done with get_vars() 10202 1727204076.53233: done getting variables 10202 1727204076.53350: done queuing things up, now waiting for results queue to drain 10202 1727204076.53353: results queue empty 10202 1727204076.53353: checking for any_errors_fatal 10202 1727204076.53357: done checking for any_errors_fatal 10202 1727204076.53358: checking for max_fail_percentage 10202 1727204076.53359: done checking for max_fail_percentage 10202 1727204076.53360: checking to see if all hosts have failed and the running result is not ok 10202 1727204076.53361: done checking to see if all hosts have failed 10202 1727204076.53361: getting the remaining hosts for this loop 10202 1727204076.53362: done getting the remaining hosts for this loop 10202 1727204076.53367: getting the 
next task for host managed-node3 10202 1727204076.53372: done getting next task for host managed-node3 10202 1727204076.53375: ^ task is: TASK: Delete the device '{{ controller_device }}' 10202 1727204076.53378: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10202 1727204076.53380: getting variables 10202 1727204076.53381: in VariableManager get_vars() 10202 1727204076.53397: Calling all_inventory to load vars for managed-node3 10202 1727204076.53400: Calling groups_inventory to load vars for managed-node3 10202 1727204076.53402: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204076.53407: Calling all_plugins_play to load vars for managed-node3 10202 1727204076.53409: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204076.53412: Calling groups_plugins_play to load vars for managed-node3 10202 1727204076.56362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204076.61799: done with get_vars() 10202 1727204076.62100: done getting variables 10202 1727204076.62455: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10202 1727204076.62719: variable 'controller_device' from source: 
play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.682) 0:00:38.302 ***** 10202 1727204076.62755: entering _queue_task() for managed-node3/command 10202 1727204076.63782: worker is 1 (out of 1 available) 10202 1727204076.63799: exiting _queue_task() for managed-node3/command 10202 1727204076.63815: done queuing things up, now waiting for results queue to drain 10202 1727204076.63816: waiting for pending results... 10202 1727204076.64369: running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' 10202 1727204076.64717: in run() - task 127b8e07-fff9-0b04-2570-0000000000c1 10202 1727204076.64722: variable 'ansible_search_path' from source: unknown 10202 1727204076.64791: calling self._execute() 10202 1727204076.65076: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204076.65080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204076.65261: variable 'omit' from source: magic vars 10202 1727204076.66030: variable 'ansible_distribution_major_version' from source: facts 10202 1727204076.66052: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204076.66067: variable 'omit' from source: magic vars 10202 1727204076.66096: variable 'omit' from source: magic vars 10202 1727204076.66222: variable 'controller_device' from source: play vars 10202 1727204076.66248: variable 'omit' from source: magic vars 10202 1727204076.66301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204076.66351: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204076.66381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
10202 1727204076.66406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204076.66425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204076.66472: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204076.66483: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204076.66547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204076.66621: Set connection var ansible_shell_type to sh 10202 1727204076.66634: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204076.66645: Set connection var ansible_connection to ssh 10202 1727204076.66664: Set connection var ansible_shell_executable to /bin/sh 10202 1727204076.66677: Set connection var ansible_pipelining to False 10202 1727204076.66687: Set connection var ansible_timeout to 10 10202 1727204076.66726: variable 'ansible_shell_executable' from source: unknown 10202 1727204076.66736: variable 'ansible_connection' from source: unknown 10202 1727204076.66744: variable 'ansible_module_compression' from source: unknown 10202 1727204076.66764: variable 'ansible_shell_type' from source: unknown 10202 1727204076.66766: variable 'ansible_shell_executable' from source: unknown 10202 1727204076.66878: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204076.66881: variable 'ansible_pipelining' from source: unknown 10202 1727204076.66883: variable 'ansible_timeout' from source: unknown 10202 1727204076.66885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204076.66948: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204076.66968: variable 'omit' from source: magic vars 10202 1727204076.66982: starting attempt loop 10202 1727204076.66992: running the handler 10202 1727204076.67011: _low_level_execute_command(): starting 10202 1727204076.67021: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204076.67950: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.68032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.68189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.68498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.70423: stdout chunk (state=3): >>>/root <<< 10202 1727204076.70627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
10202 1727204076.70634: stdout chunk (state=3): >>><<< 10202 1727204076.70637: stderr chunk (state=3): >>><<< 10202 1727204076.70639: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204076.70642: _low_level_execute_command(): starting 10202 1727204076.70645: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093 `" && echo ansible-tmp-1727204076.7060475-12350-68514857913093="` echo /root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093 `" ) && sleep 0' 10202 1727204076.71334: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204076.71341: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 10202 1727204076.71352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204076.71370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204076.71394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204076.71402: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204076.71412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.71434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204076.71471: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204076.71474: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204076.71477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204076.71479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204076.71482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204076.71487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204076.71498: stderr chunk (state=3): >>>debug2: match found <<< 10202 1727204076.71544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.71608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204076.71613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.71624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.71724: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10202 1727204076.73944: stdout chunk (state=3): >>>ansible-tmp-1727204076.7060475-12350-68514857913093=/root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093 <<< 10202 1727204076.74173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.74187: stderr chunk (state=3): >>><<< 10202 1727204076.74190: stdout chunk (state=3): >>><<< 10202 1727204076.74224: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204076.7060475-12350-68514857913093=/root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204076.74263: variable 'ansible_module_compression' from source: unknown 10202 1727204076.74368: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204076.74372: variable 'ansible_facts' from source: unknown 10202 1727204076.74444: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/AnsiballZ_command.py 10202 1727204076.74985: Sending initial data 10202 1727204076.74989: Sent initial data (155 bytes) 10202 1727204076.75513: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204076.75522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204076.75536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204076.75556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204076.75572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204076.75575: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204076.75639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.75643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204076.75645: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204076.75647: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204076.75803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204076.75807: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204076.75809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.75811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.75854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.77734: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204076.77938: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204076.78020: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpv03_0ekt /root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/AnsiballZ_command.py <<< 10202 1727204076.78024: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/AnsiballZ_command.py" <<< 10202 1727204076.78068: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpv03_0ekt" to remote "/root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/AnsiballZ_command.py" <<< 10202 1727204076.79351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.79615: stderr chunk (state=3): >>><<< 10202 1727204076.79619: stdout chunk (state=3): >>><<< 10202 1727204076.79622: done transferring module to remote 10202 1727204076.79624: _low_level_execute_command(): starting 10202 1727204076.79626: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/ /root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/AnsiballZ_command.py && sleep 0' 10202 1727204076.80228: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204076.80251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204076.80278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204076.80311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204076.80401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.80432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.80546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204076.82678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204076.82706: stdout chunk (state=3): >>><<< 10202 1727204076.82710: stderr chunk (state=3): >>><<< 10202 1727204076.82727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204076.82822: _low_level_execute_command(): starting 10202 1727204076.82826: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/AnsiballZ_command.py && sleep 0' 10202 1727204076.83486: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204076.83508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204076.83524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204076.83636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 10202 1727204077.04189: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:54:37.031000", "end": "2024-09-24 14:54:37.039796", "delta": "0:00:00.008796", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204077.05917: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.169 closed. <<< 10202 1727204077.06804: stderr chunk (state=3): >>><<< 10202 1727204077.06809: stdout chunk (state=3): >>><<< 10202 1727204077.06814: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:54:37.031000", "end": "2024-09-24 14:54:37.039796", "delta": "0:00:00.008796", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.169 closed. 10202 1727204077.06817: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204077.06826: _low_level_execute_command(): starting 10202 1727204077.06829: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204076.7060475-12350-68514857913093/ > /dev/null 2>&1 && sleep 0' 10202 1727204077.08322: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204077.08329: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204077.08333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.08545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.08549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.11522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.11671: stderr chunk (state=3): >>><<< 10202 1727204077.11724: stdout chunk (state=3): >>><<< 10202 1727204077.11973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204077.11977: handler run complete 10202 1727204077.11980: Evaluated conditional (False): False 10202 1727204077.11982: Evaluated conditional (False): False 10202 1727204077.11986: attempt loop complete, returning result 10202 1727204077.11988: _execute() done 10202 1727204077.11991: dumping result to json 10202 1727204077.11993: done dumping result, returning 10202 1727204077.11996: done running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' [127b8e07-fff9-0b04-2570-0000000000c1] 10202 1727204077.11998: sending task result for task 127b8e07-fff9-0b04-2570-0000000000c1 10202 1727204077.12088: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000c1 10202 1727204077.12093: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.008796", "end": "2024-09-24 14:54:37.039796", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:54:37.031000" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 10202 1727204077.12172: no more pending results, returning what we have 10202 1727204077.12176: results queue empty 10202 1727204077.12177: checking for any_errors_fatal 10202 1727204077.12179: done checking for any_errors_fatal 10202 1727204077.12180: checking for max_fail_percentage 10202 1727204077.12182: done 
checking for max_fail_percentage 10202 1727204077.12183: checking to see if all hosts have failed and the running result is not ok 10202 1727204077.12184: done checking to see if all hosts have failed 10202 1727204077.12185: getting the remaining hosts for this loop 10202 1727204077.12187: done getting the remaining hosts for this loop 10202 1727204077.12193: getting the next task for host managed-node3 10202 1727204077.12204: done getting next task for host managed-node3 10202 1727204077.12207: ^ task is: TASK: Remove test interfaces 10202 1727204077.12212: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204077.12217: getting variables 10202 1727204077.12219: in VariableManager get_vars() 10202 1727204077.12264: Calling all_inventory to load vars for managed-node3 10202 1727204077.12387: Calling groups_inventory to load vars for managed-node3 10202 1727204077.12391: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204077.12405: Calling all_plugins_play to load vars for managed-node3 10202 1727204077.12408: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204077.12412: Calling groups_plugins_play to load vars for managed-node3 10202 1727204077.22701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204077.24853: done with get_vars() 10202 1727204077.24896: done getting variables 10202 1727204077.24956: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.622) 0:00:38.925 ***** 10202 1727204077.24992: entering _queue_task() for managed-node3/shell 10202 1727204077.25797: worker is 1 (out of 1 available) 10202 1727204077.25814: exiting _queue_task() for managed-node3/shell 10202 1727204077.25833: done queuing things up, now waiting for results queue to drain 10202 1727204077.25834: waiting for pending results... 
10202 1727204077.26486: running TaskExecutor() for managed-node3/TASK: Remove test interfaces 10202 1727204077.26684: in run() - task 127b8e07-fff9-0b04-2570-0000000000c5 10202 1727204077.26775: variable 'ansible_search_path' from source: unknown 10202 1727204077.26780: variable 'ansible_search_path' from source: unknown 10202 1727204077.26836: calling self._execute() 10202 1727204077.27052: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204077.27056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204077.27059: variable 'omit' from source: magic vars 10202 1727204077.27734: variable 'ansible_distribution_major_version' from source: facts 10202 1727204077.27756: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204077.27772: variable 'omit' from source: magic vars 10202 1727204077.27851: variable 'omit' from source: magic vars 10202 1727204077.28045: variable 'dhcp_interface1' from source: play vars 10202 1727204077.28055: variable 'dhcp_interface2' from source: play vars 10202 1727204077.28081: variable 'omit' from source: magic vars 10202 1727204077.28132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204077.28181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204077.28205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204077.28249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204077.28253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204077.28285: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204077.28293: variable 'ansible_host' from source: host 
vars for 'managed-node3' 10202 1727204077.28300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204077.28466: Set connection var ansible_shell_type to sh 10202 1727204077.28471: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204077.28474: Set connection var ansible_connection to ssh 10202 1727204077.28477: Set connection var ansible_shell_executable to /bin/sh 10202 1727204077.28479: Set connection var ansible_pipelining to False 10202 1727204077.28481: Set connection var ansible_timeout to 10 10202 1727204077.28501: variable 'ansible_shell_executable' from source: unknown 10202 1727204077.28507: variable 'ansible_connection' from source: unknown 10202 1727204077.28513: variable 'ansible_module_compression' from source: unknown 10202 1727204077.28519: variable 'ansible_shell_type' from source: unknown 10202 1727204077.28524: variable 'ansible_shell_executable' from source: unknown 10202 1727204077.28532: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204077.28539: variable 'ansible_pipelining' from source: unknown 10202 1727204077.28545: variable 'ansible_timeout' from source: unknown 10202 1727204077.28553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204077.28872: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204077.28875: variable 'omit' from source: magic vars 10202 1727204077.28878: starting attempt loop 10202 1727204077.28881: running the handler 10202 1727204077.28883: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204077.28886: _low_level_execute_command(): starting 10202 1727204077.28888: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204077.29677: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.29743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204077.29789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.29897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.31769: stdout chunk (state=3): >>>/root <<< 10202 1727204077.31976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.32002: stdout chunk (state=3): >>><<< 10202 1727204077.32017: stderr chunk (state=3): >>><<< 10202 1727204077.32160: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204077.32164: _low_level_execute_command(): starting 10202 1727204077.32170: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010 `" && echo ansible-tmp-1727204077.3205185-12464-273628306773010="` echo /root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010 `" ) && sleep 0' 10202 1727204077.32786: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.32842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204077.32850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.32887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.32974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.35234: stdout chunk (state=3): >>>ansible-tmp-1727204077.3205185-12464-273628306773010=/root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010 <<< 10202 1727204077.35404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.35408: stdout chunk (state=3): >>><<< 10202 1727204077.35410: stderr chunk (state=3): >>><<< 10202 1727204077.35571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204077.3205185-12464-273628306773010=/root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204077.35575: variable 'ansible_module_compression' from source: unknown 10202 1727204077.35577: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204077.35593: variable 'ansible_facts' from source: unknown 10202 1727204077.35671: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/AnsiballZ_command.py 10202 1727204077.35941: Sending initial data 10202 1727204077.35945: Sent initial data (156 bytes) 10202 1727204077.36593: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204077.36615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204077.36688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.36749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.36813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.36897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.38920: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204077.38988: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204077.39059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmp7rpd_54n /root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/AnsiballZ_command.py <<< 10202 1727204077.39062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/AnsiballZ_command.py" <<< 10202 1727204077.39129: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmp7rpd_54n" to remote "/root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/AnsiballZ_command.py" <<< 10202 1727204077.39979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.40101: stderr chunk (state=3): >>><<< 10202 1727204077.40117: stdout chunk (state=3): >>><<< 10202 1727204077.40149: done transferring module to remote 10202 1727204077.40170: _low_level_execute_command(): starting 10202 1727204077.40181: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/ /root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/AnsiballZ_command.py && sleep 0' 10202 1727204077.40845: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204077.40863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204077.40882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204077.40903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204077.40921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 
10202 1727204077.40934: stderr chunk (state=3): >>>debug2: match not found <<< 10202 1727204077.40949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.40971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10202 1727204077.40985: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 10202 1727204077.40997: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10202 1727204077.41009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204077.41024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204077.41081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.41127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204077.41147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.41161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.41277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.43416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.43429: stdout chunk (state=3): >>><<< 10202 1727204077.43450: stderr chunk (state=3): >>><<< 10202 1727204077.43474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204077.43483: _low_level_execute_command(): starting 10202 1727204077.43493: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/AnsiballZ_command.py && sleep 0' 10202 1727204077.44365: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204077.44372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.44374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10202 
1727204077.44377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 10202 1727204077.44379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.44423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204077.44435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.44461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.44583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.67236: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:37.623230", "end": "2024-09-24 14:54:37.668582", "delta": "0:00:00.045352", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr 
|| rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204077.70683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 10202 1727204077.70739: stderr chunk (state=3): >>><<< 10202 1727204077.70742: stdout chunk (state=3): >>><<< 10202 1727204077.70852: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:37.623230", "end": "2024-09-24 14:54:37.668582", "delta": "0:00:00.045352", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, 
"creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
10202 1727204077.70861: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204077.70864: _low_level_execute_command(): starting 10202 1727204077.70869: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204077.3205185-12464-273628306773010/ > /dev/null 2>&1 && sleep 0' 10202 1727204077.71507: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204077.71695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.71699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204077.71701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.71703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.71758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.73939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.73944: stdout chunk (state=3): >>><<< 10202 1727204077.73947: stderr chunk (state=3): >>><<< 10202 1727204077.73982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204077.73996: handler run complete 10202 1727204077.74030: Evaluated conditional (False): False 10202 1727204077.74049: attempt loop complete, returning result 10202 1727204077.74060: _execute() done 10202 1727204077.74079: dumping result to json 10202 1727204077.74170: done dumping result, returning 10202 1727204077.74174: done running TaskExecutor() for managed-node3/TASK: Remove test interfaces [127b8e07-fff9-0b04-2570-0000000000c5] 10202 1727204077.74178: sending task result for task 127b8e07-fff9-0b04-2570-0000000000c5 10202 1727204077.74267: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000c5 10202 1727204077.74271: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.045352", "end": "2024-09-24 14:54:37.668582", "rc": 0, "start": "2024-09-24 14:54:37.623230" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 10202 1727204077.74354: no more pending results, returning what we have 10202 1727204077.74358: results queue empty 10202 1727204077.74359: checking for any_errors_fatal 10202 1727204077.74374: done checking for 
any_errors_fatal 10202 1727204077.74375: checking for max_fail_percentage 10202 1727204077.74377: done checking for max_fail_percentage 10202 1727204077.74378: checking to see if all hosts have failed and the running result is not ok 10202 1727204077.74379: done checking to see if all hosts have failed 10202 1727204077.74382: getting the remaining hosts for this loop 10202 1727204077.74383: done getting the remaining hosts for this loop 10202 1727204077.74388: getting the next task for host managed-node3 10202 1727204077.74396: done getting next task for host managed-node3 10202 1727204077.74399: ^ task is: TASK: Stop dnsmasq/radvd services 10202 1727204077.74404: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204077.74410: getting variables 10202 1727204077.74412: in VariableManager get_vars() 10202 1727204077.74462: Calling all_inventory to load vars for managed-node3 10202 1727204077.74674: Calling groups_inventory to load vars for managed-node3 10202 1727204077.74678: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204077.74691: Calling all_plugins_play to load vars for managed-node3 10202 1727204077.74694: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204077.74697: Calling groups_plugins_play to load vars for managed-node3 10202 1727204077.76943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204077.79227: done with get_vars() 10202 1727204077.79255: done getting variables 10202 1727204077.79329: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.543) 0:00:39.469 ***** 10202 1727204077.79367: entering _queue_task() for managed-node3/shell 10202 1727204077.79760: worker is 1 (out of 1 available) 10202 1727204077.79977: exiting _queue_task() for managed-node3/shell 10202 1727204077.79989: done queuing things up, now waiting for results queue to drain 10202 1727204077.79990: waiting for pending results... 
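
The "Remove test interfaces" result earlier in this log runs each `ip link delete` as `ip link delete X || rc="$?"`, so that under `set -euxo pipefail` a failed delete is recorded instead of aborting the script. A minimal sketch of that capture idiom (the `false` here is a hypothetical stand-in for a failing delete):

```shell
# Error-capture idiom from the task script above: under `set -e` a bare
# failing command aborts the script, but `cmd || rc="$?"` makes the
# compound command succeed while still recording the failing status.
set -eu
rc=0
false || rc="$?"   # `false` stands in for a failing `ip link delete`
echo "captured rc=$rc"   # prints: captured rc=1
```

Note that the task script only echoes an ERROR line when `rc` is nonzero and never exits with it, which is consistent with the module reporting `"rc": 0` regardless of individual delete failures.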
10202 1727204077.80304: running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services 10202 1727204077.80310: in run() - task 127b8e07-fff9-0b04-2570-0000000000c6 10202 1727204077.80313: variable 'ansible_search_path' from source: unknown 10202 1727204077.80316: variable 'ansible_search_path' from source: unknown 10202 1727204077.80380: calling self._execute() 10202 1727204077.80440: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204077.80447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204077.80459: variable 'omit' from source: magic vars 10202 1727204077.80891: variable 'ansible_distribution_major_version' from source: facts 10202 1727204077.80903: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204077.80945: variable 'omit' from source: magic vars 10202 1727204077.80976: variable 'omit' from source: magic vars 10202 1727204077.81047: variable 'omit' from source: magic vars 10202 1727204077.81064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204077.81109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204077.81154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204077.81450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204077.81455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204077.81458: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204077.81462: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204077.81466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 
1727204077.81470: Set connection var ansible_shell_type to sh 10202 1727204077.81473: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204077.81477: Set connection var ansible_connection to ssh 10202 1727204077.81480: Set connection var ansible_shell_executable to /bin/sh 10202 1727204077.81483: Set connection var ansible_pipelining to False 10202 1727204077.81486: Set connection var ansible_timeout to 10 10202 1727204077.81496: variable 'ansible_shell_executable' from source: unknown 10202 1727204077.81500: variable 'ansible_connection' from source: unknown 10202 1727204077.81503: variable 'ansible_module_compression' from source: unknown 10202 1727204077.81506: variable 'ansible_shell_type' from source: unknown 10202 1727204077.81509: variable 'ansible_shell_executable' from source: unknown 10202 1727204077.81512: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204077.81516: variable 'ansible_pipelining' from source: unknown 10202 1727204077.81519: variable 'ansible_timeout' from source: unknown 10202 1727204077.81522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204077.81654: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204077.81658: variable 'omit' from source: magic vars 10202 1727204077.81661: starting attempt loop 10202 1727204077.81663: running the handler 10202 1727204077.81668: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204077.81672: 
_low_level_execute_command(): starting 10202 1727204077.81674: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204077.83187: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204077.83276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.83530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.83618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.85508: stdout chunk (state=3): >>>/root <<< 10202 1727204077.85810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.85814: stdout chunk (state=3): >>><<< 10202 1727204077.85817: stderr chunk (state=3): >>><<< 10202 1727204077.85821: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204077.85824: _low_level_execute_command(): starting 10202 1727204077.85826: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240 `" && echo ansible-tmp-1727204077.8579514-12516-54413927257240="` echo /root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240 `" ) && sleep 0' 10202 1727204077.87040: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204077.87088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.87138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.87236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.87359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.89499: stdout chunk (state=3): >>>ansible-tmp-1727204077.8579514-12516-54413927257240=/root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240 <<< 10202 1727204077.89670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.89883: stderr chunk (state=3): >>><<< 10202 1727204077.89889: stdout chunk (state=3): >>><<< 10202 1727204077.90166: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204077.8579514-12516-54413927257240=/root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204077.90172: variable 'ansible_module_compression' from source: unknown 10202 1727204077.90175: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204077.90177: variable 'ansible_facts' from source: unknown 10202 1727204077.90179: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/AnsiballZ_command.py 10202 1727204077.90460: Sending initial data 10202 1727204077.90473: Sent initial data (155 bytes) 10202 1727204077.91025: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204077.91054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204077.91080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
10202 1727204077.91164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.91205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.91283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.93121: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10202 1727204077.93161: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10202 1727204077.93170: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204077.93243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204077.93323: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpwyixsaq8 /root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/AnsiballZ_command.py <<< 10202 1727204077.93327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/AnsiballZ_command.py" <<< 10202 1727204077.93378: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpwyixsaq8" to remote "/root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/AnsiballZ_command.py" <<< 10202 1727204077.94277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.94401: stderr chunk (state=3): >>><<< 10202 1727204077.94544: stdout chunk (state=3): >>><<< 10202 1727204077.94548: done transferring module to remote 10202 1727204077.94550: _low_level_execute_command(): starting 10202 1727204077.94553: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/ /root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/AnsiballZ_command.py && sleep 0' 10202 1727204077.95161: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204077.95225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204077.95291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204077.95317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.95347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204077.95460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204077.97632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204077.97642: stdout chunk (state=3): >>><<< 10202 1727204077.97645: stderr chunk (state=3): >>><<< 10202 1727204077.97668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204077.97687: _low_level_execute_command(): starting 10202 1727204077.97706: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/AnsiballZ_command.py && sleep 0' 10202 1727204077.98450: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204077.98468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204077.98497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204077.98621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204077.98638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204077.98657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 10202 1727204077.98773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204078.19902: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:38.165382", "end": "2024-09-24 14:54:38.196973", "delta": "0:00:00.031591", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204078.21801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204078.21805: stdout chunk (state=3): >>><<< 10202 1727204078.21808: stderr chunk (state=3): >>><<< 10202 1727204078.21872: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:38.165382", "end": "2024-09-24 14:54:38.196973", "delta": "0:00:00.031591", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
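
The cleanup script in the result above opens with `set -uxo pipefail` (note: no `-e`) and `exec 1>&2`. Omitting `-e` lets probes such as `grep 'release 6' /etc/redhat-release` fail without killing the script, and the `exec` folds every later stdout write into stderr, which is why the module result shows an empty `"stdout"` and all the `+` trace lines under `"stderr"`. A small bash sketch of the same prologue (the `grep` target is a hypothetical stand-in):

```shell
# Prologue pattern from the cleanup script above: no `-e`, so failing
# probes don't abort the script; `exec 1>&2` sends every subsequent
# stdout write to stderr, leaving the captured stdout empty.
set -uxo pipefail
exec 1>&2
grep 'release 6' /dev/null || :   # probe fails (rc=1); script continues
echo "cleanup continues"          # lands on stderr, not stdout
```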
10202 1727204078.21895: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204078.21903: _low_level_execute_command(): starting 10202 1727204078.22071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204077.8579514-12516-54413927257240/ > /dev/null 2>&1 && sleep 0' 10202 1727204078.22610: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204078.22620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204078.22642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204078.22659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 
1727204078.22674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204078.22749: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204078.22786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204078.22802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204078.22822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204078.22926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204078.25173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204078.25177: stderr chunk (state=3): >>><<< 10202 1727204078.25179: stdout chunk (state=3): >>><<< 10202 1727204078.25182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204078.25185: handler run complete 10202 1727204078.25188: Evaluated conditional (False): False 10202 1727204078.25190: attempt loop complete, returning result 10202 1727204078.25192: _execute() done 10202 1727204078.25194: dumping result to json 10202 1727204078.25207: done dumping result, returning 10202 1727204078.25217: done running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services [127b8e07-fff9-0b04-2570-0000000000c6] 10202 1727204078.25223: sending task result for task 127b8e07-fff9-0b04-2570-0000000000c6 10202 1727204078.25340: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000c6 10202 1727204078.25344: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.031591", "end": "2024-09-24 
14:54:38.196973", "rc": 0, "start": "2024-09-24 14:54:38.165382" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 10202 1727204078.25542: no more pending results, returning what we have 10202 1727204078.25546: results queue empty 10202 1727204078.25547: checking for any_errors_fatal 10202 1727204078.25559: done checking for any_errors_fatal 10202 1727204078.25560: checking for max_fail_percentage 10202 1727204078.25562: done checking for max_fail_percentage 10202 1727204078.25563: checking to see if all hosts have failed and the running result is not ok 10202 1727204078.25564: done checking to see if all hosts have failed 10202 1727204078.25565: getting the remaining hosts for this loop 10202 1727204078.25568: done getting the remaining hosts for this loop 10202 1727204078.25573: getting the next task for host managed-node3 10202 1727204078.25582: done getting next task for host managed-node3 10202 1727204078.25586: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 10202 1727204078.25588: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204078.25593: getting variables 10202 1727204078.25594: in VariableManager get_vars() 10202 1727204078.25638: Calling all_inventory to load vars for managed-node3 10202 1727204078.25641: Calling groups_inventory to load vars for managed-node3 10202 1727204078.25644: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204078.25658: Calling all_plugins_play to load vars for managed-node3 10202 1727204078.25661: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204078.25664: Calling groups_plugins_play to load vars for managed-node3 10202 1727204078.27725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204078.29953: done with get_vars() 10202 1727204078.29997: done getting variables 10202 1727204078.30071: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.507) 0:00:39.976 ***** 10202 1727204078.30112: entering _queue_task() for managed-node3/command 10202 1727204078.30507: worker is 1 (out of 1 available) 10202 1727204078.30636: exiting _queue_task() for managed-node3/command 10202 1727204078.30650: done queuing things up, now waiting for results queue to drain 10202 1727204078.30651: waiting for pending results... 
10202 1727204078.31182: running TaskExecutor() for managed-node3/TASK: Restore the /etc/resolv.conf for initscript 10202 1727204078.31188: in run() - task 127b8e07-fff9-0b04-2570-0000000000c7 10202 1727204078.31192: variable 'ansible_search_path' from source: unknown 10202 1727204078.31195: calling self._execute() 10202 1727204078.31198: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204078.31200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204078.31208: variable 'omit' from source: magic vars 10202 1727204078.31648: variable 'ansible_distribution_major_version' from source: facts 10202 1727204078.31659: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204078.31783: variable 'network_provider' from source: set_fact 10202 1727204078.31787: Evaluated conditional (network_provider == "initscripts"): False 10202 1727204078.31791: when evaluation is False, skipping this task 10202 1727204078.31794: _execute() done 10202 1727204078.31797: dumping result to json 10202 1727204078.31802: done dumping result, returning 10202 1727204078.31813: done running TaskExecutor() for managed-node3/TASK: Restore the /etc/resolv.conf for initscript [127b8e07-fff9-0b04-2570-0000000000c7] 10202 1727204078.31819: sending task result for task 127b8e07-fff9-0b04-2570-0000000000c7 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10202 1727204078.31990: no more pending results, returning what we have 10202 1727204078.31995: results queue empty 10202 1727204078.31996: checking for any_errors_fatal 10202 1727204078.32011: done checking for any_errors_fatal 10202 1727204078.32012: checking for max_fail_percentage 10202 1727204078.32014: done checking for max_fail_percentage 10202 1727204078.32016: checking to see if all hosts have failed and the running result is not ok 10202 
1727204078.32017: done checking to see if all hosts have failed 10202 1727204078.32018: getting the remaining hosts for this loop 10202 1727204078.32020: done getting the remaining hosts for this loop 10202 1727204078.32026: getting the next task for host managed-node3 10202 1727204078.32035: done getting next task for host managed-node3 10202 1727204078.32039: ^ task is: TASK: Verify network state restored to default 10202 1727204078.32043: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204078.32049: getting variables 10202 1727204078.32051: in VariableManager get_vars() 10202 1727204078.32223: Calling all_inventory to load vars for managed-node3 10202 1727204078.32227: Calling groups_inventory to load vars for managed-node3 10202 1727204078.32229: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204078.32245: Calling all_plugins_play to load vars for managed-node3 10202 1727204078.32249: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204078.32253: Calling groups_plugins_play to load vars for managed-node3 10202 1727204078.32775: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000c7 10202 1727204078.32780: WORKER PROCESS EXITING 10202 1727204078.34493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204078.36859: done with get_vars() 10202 1727204078.36935: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.069) 0:00:40.046 ***** 10202 1727204078.37098: entering _queue_task() for managed-node3/include_tasks 10202 1727204078.37530: worker is 1 (out of 1 available) 10202 1727204078.37546: exiting _queue_task() for managed-node3/include_tasks 10202 1727204078.37560: done queuing things up, now waiting for results queue to drain 10202 1727204078.37561: waiting for pending results... 
10202 1727204078.38180: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 10202 1727204078.38308: in run() - task 127b8e07-fff9-0b04-2570-0000000000c8 10202 1727204078.38312: variable 'ansible_search_path' from source: unknown 10202 1727204078.38315: calling self._execute() 10202 1727204078.38422: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204078.38439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204078.38455: variable 'omit' from source: magic vars 10202 1727204078.38891: variable 'ansible_distribution_major_version' from source: facts 10202 1727204078.38912: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204078.38923: _execute() done 10202 1727204078.38931: dumping result to json 10202 1727204078.38938: done dumping result, returning 10202 1727204078.38947: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [127b8e07-fff9-0b04-2570-0000000000c8] 10202 1727204078.38960: sending task result for task 127b8e07-fff9-0b04-2570-0000000000c8 10202 1727204078.39325: no more pending results, returning what we have 10202 1727204078.39333: in VariableManager get_vars() 10202 1727204078.39398: Calling all_inventory to load vars for managed-node3 10202 1727204078.39402: Calling groups_inventory to load vars for managed-node3 10202 1727204078.39404: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204078.39422: Calling all_plugins_play to load vars for managed-node3 10202 1727204078.39425: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204078.39432: Calling groups_plugins_play to load vars for managed-node3 10202 1727204078.40184: done sending task result for task 127b8e07-fff9-0b04-2570-0000000000c8 10202 1727204078.40188: WORKER PROCESS EXITING 10202 1727204078.41453: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204078.44831: done with get_vars() 10202 1727204078.44856: variable 'ansible_search_path' from source: unknown 10202 1727204078.44876: we have included files to process 10202 1727204078.44877: generating all_blocks data 10202 1727204078.44882: done generating all_blocks data 10202 1727204078.44889: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10202 1727204078.44890: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10202 1727204078.44893: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10202 1727204078.45356: done processing included file 10202 1727204078.45359: iterating over new_blocks loaded from include file 10202 1727204078.45360: in VariableManager get_vars() 10202 1727204078.45384: done with get_vars() 10202 1727204078.45386: filtering new block on tags 10202 1727204078.45425: done filtering new block on tags 10202 1727204078.45430: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 10202 1727204078.45437: extending task lists for all hosts with included blocks 10202 1727204078.47424: done extending task lists 10202 1727204078.47426: done processing included files 10202 1727204078.47427: results queue empty 10202 1727204078.47430: checking for any_errors_fatal 10202 1727204078.47436: done checking for any_errors_fatal 10202 1727204078.47437: checking for max_fail_percentage 10202 1727204078.47439: done checking for max_fail_percentage 10202 1727204078.47439: checking to see if all hosts have failed and the running 
result is not ok 10202 1727204078.47441: done checking to see if all hosts have failed 10202 1727204078.47441: getting the remaining hosts for this loop 10202 1727204078.47443: done getting the remaining hosts for this loop 10202 1727204078.47446: getting the next task for host managed-node3 10202 1727204078.47451: done getting next task for host managed-node3 10202 1727204078.47454: ^ task is: TASK: Check routes and DNS 10202 1727204078.47457: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204078.47460: getting variables 10202 1727204078.47461: in VariableManager get_vars() 10202 1727204078.47486: Calling all_inventory to load vars for managed-node3 10202 1727204078.47489: Calling groups_inventory to load vars for managed-node3 10202 1727204078.47492: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204078.47500: Calling all_plugins_play to load vars for managed-node3 10202 1727204078.47502: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204078.47506: Calling groups_plugins_play to load vars for managed-node3 10202 1727204078.49560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204078.52103: done with get_vars() 10202 1727204078.52145: done getting variables 10202 1727204078.52207: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.151) 0:00:40.197 ***** 10202 1727204078.52243: entering _queue_task() for managed-node3/shell 10202 1727204078.52796: worker is 1 (out of 1 available) 10202 1727204078.52808: exiting _queue_task() for managed-node3/shell 10202 1727204078.52822: done queuing things up, now waiting for results queue to drain 10202 1727204078.52824: waiting for pending results... 
10202 1727204078.53190: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 10202 1727204078.53420: in run() - task 127b8e07-fff9-0b04-2570-00000000056d 10202 1727204078.53424: variable 'ansible_search_path' from source: unknown 10202 1727204078.53430: variable 'ansible_search_path' from source: unknown 10202 1727204078.53433: calling self._execute() 10202 1727204078.53544: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204078.53557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204078.53576: variable 'omit' from source: magic vars 10202 1727204078.54023: variable 'ansible_distribution_major_version' from source: facts 10202 1727204078.54047: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204078.54061: variable 'omit' from source: magic vars 10202 1727204078.54131: variable 'omit' from source: magic vars 10202 1727204078.54185: variable 'omit' from source: magic vars 10202 1727204078.54295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10202 1727204078.54299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10202 1727204078.54311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10202 1727204078.54340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204078.54357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10202 1727204078.54400: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10202 1727204078.54412: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204078.54421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204078.54679: 
Set connection var ansible_shell_type to sh 10202 1727204078.54693: Set connection var ansible_module_compression to ZIP_DEFLATED 10202 1727204078.54770: Set connection var ansible_connection to ssh 10202 1727204078.54773: Set connection var ansible_shell_executable to /bin/sh 10202 1727204078.54776: Set connection var ansible_pipelining to False 10202 1727204078.55277: Set connection var ansible_timeout to 10 10202 1727204078.55281: variable 'ansible_shell_executable' from source: unknown 10202 1727204078.55284: variable 'ansible_connection' from source: unknown 10202 1727204078.55287: variable 'ansible_module_compression' from source: unknown 10202 1727204078.55289: variable 'ansible_shell_type' from source: unknown 10202 1727204078.55292: variable 'ansible_shell_executable' from source: unknown 10202 1727204078.55295: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204078.55297: variable 'ansible_pipelining' from source: unknown 10202 1727204078.55300: variable 'ansible_timeout' from source: unknown 10202 1727204078.55303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204078.55306: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204078.55319: variable 'omit' from source: magic vars 10202 1727204078.55333: starting attempt loop 10202 1727204078.55391: running the handler 10202 1727204078.55411: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10202 1727204078.55448: 
_low_level_execute_command(): starting 10202 1727204078.55491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10202 1727204078.56794: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204078.56875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204078.56897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204078.57154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204078.57252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204078.59099: stdout chunk (state=3): >>>/root <<< 10202 1727204078.59324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204078.59332: stdout chunk (state=3): >>><<< 10202 1727204078.59381: stderr chunk (state=3): >>><<< 10202 1727204078.59479: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204078.59483: _low_level_execute_command(): starting 10202 1727204078.59487: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928 `" && echo ansible-tmp-1727204078.5940897-12594-270041959878928="` echo /root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928 `" ) && sleep 0' 10202 1727204078.60949: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10202 1727204078.60954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 10202 1727204078.60969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10202 1727204078.60973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 10202 1727204078.60976: stderr chunk (state=3): >>>debug2: match found <<< 10202 1727204078.60978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204078.61102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204078.61106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204078.61219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204078.63419: stdout chunk (state=3): >>>ansible-tmp-1727204078.5940897-12594-270041959878928=/root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928 <<< 10202 1727204078.63641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204078.63645: stdout chunk (state=3): >>><<< 10202 1727204078.63648: stderr chunk (state=3): >>><<< 10202 1727204078.63670: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204078.5940897-12594-270041959878928=/root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204078.63820: variable 'ansible_module_compression' from source: unknown 10202 1727204078.63823: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10202puqcm74n/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10202 1727204078.63826: variable 'ansible_facts' from source: unknown 10202 1727204078.63900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/AnsiballZ_command.py 10202 1727204078.64188: Sending initial data 10202 1727204078.64198: Sent initial data (156 bytes) 10202 1727204078.64847: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204078.64865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204078.64953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204078.64997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204078.65017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204078.65051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204078.65170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204078.67016: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10202 1727204078.67142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10202 1727204078.67222: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10202puqcm74n/tmpb19q1_qd /root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/AnsiballZ_command.py <<< 10202 1727204078.67226: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/AnsiballZ_command.py" <<< 10202 1727204078.67287: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-10202puqcm74n/tmpb19q1_qd" to remote "/root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/AnsiballZ_command.py" <<< 10202 1727204078.68251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204078.68291: stderr chunk (state=3): >>><<< 10202 1727204078.68303: stdout chunk (state=3): >>><<< 10202 1727204078.68341: done transferring module to remote 10202 1727204078.68360: _low_level_execute_command(): starting 10202 1727204078.68372: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/ /root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/AnsiballZ_command.py && sleep 0' 10202 1727204078.69103: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204078.69125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 10202 1727204078.69172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 
1727204078.69187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204078.69288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204078.69340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204078.69399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204078.71697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204078.71742: stderr chunk (state=3): >>><<< 10202 1727204078.71872: stdout chunk (state=3): >>><<< 10202 1727204078.71876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204078.71879: _low_level_execute_command(): starting 10202 1727204078.71882: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/AnsiballZ_command.py && sleep 0' 10202 1727204078.73415: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 10202 1727204078.73569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204078.73596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204078.73649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204078.73798: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204078.92563: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3219sec preferred_lft 3219sec\n inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:38.913298", "end": "2024-09-24 14:54:38.923359", "delta": "0:00:00.010061", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10202 1727204078.94423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204078.94546: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. 
<<< 10202 1727204078.94566: stdout chunk (state=3): >>><<< 10202 1727204078.94576: stderr chunk (state=3): >>><<< 10202 1727204078.94601: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3219sec preferred_lft 3219sec\n inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:38.913298", "end": "2024-09-24 14:54:38.923359", "delta": "0:00:00.010061", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 10202 1727204078.94678: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10202 1727204078.94771: _low_level_execute_command(): starting 10202 1727204078.94774: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204078.5940897-12594-270041959878928/ > /dev/null 2>&1 && sleep 0' 10202 1727204078.95545: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 10202 1727204078.95576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10202 1727204078.95681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10202 1727204078.95720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 10202 1727204078.95740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10202 1727204078.95767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10202 1727204078.95989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10202 1727204078.98280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10202 1727204078.98285: stdout chunk (state=3): >>><<< 10202 1727204078.98288: stderr chunk (state=3): >>><<< 10202 1727204078.98291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10202 1727204078.98293: handler run complete 10202 1727204078.98296: Evaluated conditional (False): False 10202 1727204078.98298: attempt loop complete, returning result 10202 1727204078.98300: _execute() done 10202 1727204078.98302: dumping result to json 10202 1727204078.98305: done dumping result, returning 10202 1727204078.98307: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [127b8e07-fff9-0b04-2570-00000000056d] 10202 1727204078.98309: sending task result for task 127b8e07-fff9-0b04-2570-00000000056d 10202 1727204078.98402: done sending task result for task 127b8e07-fff9-0b04-2570-00000000056d 10202 1727204078.98405: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.010061", "end": "2024-09-24 14:54:38.923359", "rc": 0, "start": "2024-09-24 14:54:38.913298" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft 
forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3219sec preferred_lft 3219sec inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 10202 1727204078.98499: no more pending results, returning what we have 10202 1727204078.98503: results queue empty 10202 1727204078.98504: checking for any_errors_fatal 10202 1727204078.98506: done checking for any_errors_fatal 10202 1727204078.98507: checking for max_fail_percentage 10202 1727204078.98509: done checking for max_fail_percentage 10202 1727204078.98510: checking to see if all hosts have failed and the running result is not ok 10202 1727204078.98511: done checking to see if all hosts have failed 10202 1727204078.98512: getting the remaining hosts for this loop 10202 1727204078.98514: done getting the remaining hosts for this loop 10202 1727204078.98519: getting the next task for host managed-node3 10202 1727204078.98533: done getting next task for host managed-node3 10202 1727204078.98536: ^ task is: TASK: Verify DNS and network connectivity 10202 1727204078.98545: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10202 1727204078.98551: getting variables 10202 1727204078.98553: in VariableManager get_vars() 10202 1727204078.98608: Calling all_inventory to load vars for managed-node3 10202 1727204078.98611: Calling groups_inventory to load vars for managed-node3 10202 1727204078.98614: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204078.98632: Calling all_plugins_play to load vars for managed-node3 10202 1727204078.98635: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204078.98639: Calling groups_plugins_play to load vars for managed-node3 10202 1727204079.01823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204079.04083: done with get_vars() 10202 1727204079.04122: done getting variables 10202 1727204079.04199: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.519) 0:00:40.717 ***** 10202 1727204079.04236: entering _queue_task() for managed-node3/shell 10202 1727204079.04654: worker is 1 (out of 1 available) 10202 1727204079.04672: exiting _queue_task() for managed-node3/shell 10202 1727204079.04686: done queuing things up, now waiting for results queue to drain 10202 1727204079.04688: waiting for pending results... 
10202 1727204079.05283: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 10202 1727204079.05288: in run() - task 127b8e07-fff9-0b04-2570-00000000056e 10202 1727204079.05290: variable 'ansible_search_path' from source: unknown 10202 1727204079.05292: variable 'ansible_search_path' from source: unknown 10202 1727204079.05295: calling self._execute() 10202 1727204079.05369: variable 'ansible_host' from source: host vars for 'managed-node3' 10202 1727204079.05382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10202 1727204079.05396: variable 'omit' from source: magic vars 10202 1727204079.05885: variable 'ansible_distribution_major_version' from source: facts 10202 1727204079.05906: Evaluated conditional (ansible_distribution_major_version != '6'): True 10202 1727204079.06093: variable 'ansible_facts' from source: unknown 10202 1727204079.07163: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 10202 1727204079.07176: when evaluation is False, skipping this task 10202 1727204079.07184: _execute() done 10202 1727204079.07191: dumping result to json 10202 1727204079.07201: done dumping result, returning 10202 1727204079.07215: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [127b8e07-fff9-0b04-2570-00000000056e] 10202 1727204079.07231: sending task result for task 127b8e07-fff9-0b04-2570-00000000056e skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 10202 1727204079.07421: no more pending results, returning what we have 10202 1727204079.07425: results queue empty 10202 1727204079.07426: checking for any_errors_fatal 10202 1727204079.07443: done checking for any_errors_fatal 10202 1727204079.07444: checking for max_fail_percentage 10202 1727204079.07446: done checking for max_fail_percentage 10202 1727204079.07447: checking to see if 
all hosts have failed and the running result is not ok 10202 1727204079.07448: done checking to see if all hosts have failed 10202 1727204079.07449: getting the remaining hosts for this loop 10202 1727204079.07451: done getting the remaining hosts for this loop 10202 1727204079.07456: getting the next task for host managed-node3 10202 1727204079.07470: done getting next task for host managed-node3 10202 1727204079.07472: ^ task is: TASK: meta (flush_handlers) 10202 1727204079.07475: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204079.07480: getting variables 10202 1727204079.07482: in VariableManager get_vars() 10202 1727204079.07535: Calling all_inventory to load vars for managed-node3 10202 1727204079.07538: Calling groups_inventory to load vars for managed-node3 10202 1727204079.07541: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204079.07558: Calling all_plugins_play to load vars for managed-node3 10202 1727204079.07561: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204079.07564: Calling groups_plugins_play to load vars for managed-node3 10202 1727204079.08405: done sending task result for task 127b8e07-fff9-0b04-2570-00000000056e 10202 1727204079.08410: WORKER PROCESS EXITING 10202 1727204079.09713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204079.12089: done with get_vars() 10202 1727204079.12135: done getting variables 10202 1727204079.12220: in VariableManager get_vars() 10202 1727204079.12246: Calling all_inventory to load vars for managed-node3 10202 1727204079.12249: Calling groups_inventory to load vars for managed-node3 10202 1727204079.12251: Calling 
all_plugins_inventory to load vars for managed-node3 10202 1727204079.12258: Calling all_plugins_play to load vars for managed-node3 10202 1727204079.12261: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204079.12264: Calling groups_plugins_play to load vars for managed-node3 10202 1727204079.13881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204079.16134: done with get_vars() 10202 1727204079.16186: done queuing things up, now waiting for results queue to drain 10202 1727204079.16188: results queue empty 10202 1727204079.16189: checking for any_errors_fatal 10202 1727204079.16193: done checking for any_errors_fatal 10202 1727204079.16193: checking for max_fail_percentage 10202 1727204079.16195: done checking for max_fail_percentage 10202 1727204079.16196: checking to see if all hosts have failed and the running result is not ok 10202 1727204079.16196: done checking to see if all hosts have failed 10202 1727204079.16197: getting the remaining hosts for this loop 10202 1727204079.16198: done getting the remaining hosts for this loop 10202 1727204079.16201: getting the next task for host managed-node3 10202 1727204079.16205: done getting next task for host managed-node3 10202 1727204079.16207: ^ task is: TASK: meta (flush_handlers) 10202 1727204079.16209: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10202 1727204079.16211: getting variables 10202 1727204079.16212: in VariableManager get_vars() 10202 1727204079.16231: Calling all_inventory to load vars for managed-node3 10202 1727204079.16234: Calling groups_inventory to load vars for managed-node3 10202 1727204079.16236: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204079.16243: Calling all_plugins_play to load vars for managed-node3 10202 1727204079.16246: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204079.16249: Calling groups_plugins_play to load vars for managed-node3 10202 1727204079.17816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204079.20096: done with get_vars() 10202 1727204079.20135: done getting variables 10202 1727204079.20192: in VariableManager get_vars() 10202 1727204079.20212: Calling all_inventory to load vars for managed-node3 10202 1727204079.20214: Calling groups_inventory to load vars for managed-node3 10202 1727204079.20216: Calling all_plugins_inventory to load vars for managed-node3 10202 1727204079.20221: Calling all_plugins_play to load vars for managed-node3 10202 1727204079.20224: Calling groups_plugins_inventory to load vars for managed-node3 10202 1727204079.20226: Calling groups_plugins_play to load vars for managed-node3 10202 1727204079.22163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10202 1727204079.24544: done with get_vars() 10202 1727204079.24594: done queuing things up, now waiting for results queue to drain 10202 1727204079.24596: results queue empty 10202 1727204079.24597: checking for any_errors_fatal 10202 1727204079.24599: done checking for any_errors_fatal 10202 1727204079.24600: checking for max_fail_percentage 10202 1727204079.24601: done checking for max_fail_percentage 10202 1727204079.24602: checking to see if all hosts have failed and the running result is not 
ok 10202 1727204079.24603: done checking to see if all hosts have failed 10202 1727204079.24603: getting the remaining hosts for this loop 10202 1727204079.24605: done getting the remaining hosts for this loop 10202 1727204079.24615: getting the next task for host managed-node3 10202 1727204079.24619: done getting next task for host managed-node3 10202 1727204079.24620: ^ task is: None 10202 1727204079.24622: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10202 1727204079.24623: done queuing things up, now waiting for results queue to drain 10202 1727204079.24624: results queue empty 10202 1727204079.24624: checking for any_errors_fatal 10202 1727204079.24625: done checking for any_errors_fatal 10202 1727204079.24626: checking for max_fail_percentage 10202 1727204079.24627: done checking for max_fail_percentage 10202 1727204079.24628: checking to see if all hosts have failed and the running result is not ok 10202 1727204079.24628: done checking to see if all hosts have failed 10202 1727204079.24630: getting the next task for host managed-node3 10202 1727204079.24634: done getting next task for host managed-node3 10202 1727204079.24634: ^ task is: None 10202 1727204079.24636: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
PLAY RECAP *********************************************************************
managed-node3              : ok=75   changed=2    unreachable=0    failed=0    skipped=61   rescued=0    ignored=0

Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.205)       0:00:40.922 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.90s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.84s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 2.05s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Install dnsmasq --------------------------------------------------------- 1.71s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Install pgrep, sysctl --------------------------------------------------- 1.67s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.42s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.39s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.37s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.36s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.21s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.08s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.84s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.78s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.68s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.68s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Check if system is ostree ----------------------------------------------- 0.66s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Delete the device 'nm-bond' --------------------------------------------- 0.62s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114
Remove test interfaces -------------------------------------------------- 0.54s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Get NM profile info ----------------------------------------------------- 0.54s
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
10202 1727204079.24790: RUNNING CLEANUP
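For reference, the "Check routes and DNS" diagnostic whose result appears earlier in this log can be reconstructed directly from the logged module arguments (`_raw_params` of `ansible.legacy.command` with `_uses_shell: true`). The script body below is taken verbatim from the log; the task name matches the logged task, but the module spelling (`ansible.builtin.shell`) and indentation are assumptions about how the source playbook is written:

```yaml
# Reconstructed from the logged module args; not the literal playbook source.
- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi
```

Each `echo` emits a section header, which is why the captured stdout in the log interleaves `IP`, `IP ROUTE`, `IP -6 ROUTE`, and `RESOLV` markers with the command output; `set -euo pipefail` makes any failing command abort the task with a nonzero `rc`.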